Commit 0c19a4d

Cut new release 3.3.2
Co-authored-by: Isaac
Signed-off-by: Madhavendra Rathore <madhavendra.rathore@databricks.com>
1 parent 75d13a2 commit 0c19a4d

14 files changed: 66 additions & 33 deletions

CHANGELOG.md

Lines changed: 24 additions & 0 deletions
@@ -1,5 +1,29 @@
 # Version Changelog
 
+## [v3.3.2] - 2026-04-27
+
+### Added
+- Added `CallableStatement` support with IN parameters. `Connection.prepareCall()` now returns a working `DatabricksCallableStatement` that supports positional parameter binding and execution via `{call proc(?)}` JDBC escape syntax. OUT/INOUT parameters and named parameters throw `SQLFeatureNotSupportedException`.
+- Added AI coding agent detection to the User-Agent header. When the driver is invoked by a known AI coding agent (e.g. Claude Code, Cursor, Gemini CLI), `agent/<product>` is appended to the User-Agent string.
+
+### Updated
+- Added support for using SQL SHOW commands for Thrift-mode metadata operations (`getTables`, `getColumns`, `getSchemas`, `getFunctions`, `getPrimaryKeys`, `getImportedKeys`, `getCrossReference`). Enable by setting `UseQueryForMetadata=1`. This aligns Thrift metadata behavior with Statement Execution API (SEA) mode.
+
+### Fixed
+- Improved error messages for cancelled statements: operations cancelled via `Statement.cancel()` or closed connections now return SQL state `HY008` (operation cancelled) instead of generic error codes, making it easier for applications to detect and handle cancellations.
+- Fixed a race condition between chunk download error handling and result set close that could cause invalid state transition warnings (`CHUNK_RELEASED -> DOWNLOAD_FAILED`) during Arrow Cloud Fetch operations in resource-constrained environments.
+- Fixed `EnableBatchedInserts` silently falling back to individual execution when table or schema names contain special characters (e.g., hyphens) inside backtick-quoted identifiers. Added a warn log when the fallback occurs.
+- Fixed an `IntervalConverter` crash (`IllegalArgumentException: Invalid interval metadata`) when INTERVAL columns are returned via CloudFetch. Arrow metadata from CloudFetch uses an underscored format (`INTERVAL_YEAR_MONTH`, `INTERVAL_DAY_TIME`) which the driver's regex did not accept.
+- Fixed `Statement` being prematurely closed after queries that return inline results, which prevented re-execution, `getResultSet()`, and `getExecutionResult()` from working. Statements now remain open and reusable until explicitly closed by the caller.
+- Fixed primitive types within complex types (ARRAY, MAP, STRUCT) not being correctly parsed when Arrow serialization uses alternate formats: TIMESTAMP/TIMESTAMP_NTZ as epoch microseconds or component arrays, and BINARY as base64-encoded strings.
+- Fixed `PARSE_SYNTAX_ERROR` for column names containing special characters (e.g., dots) when `EnableBatchedInserts` is enabled, by re-quoting column names with backticks in reconstructed multi-row INSERT statements.
+- Fixed Volume ingestion for SEA mode, which was broken due to the statement being closed prematurely.
+- Fixed unclear `error: [null]` messages during transient HTTP failures (e.g. 502 Bad Gateway) in Thrift polling. Error messages now include server error details and use SQL state `08S01` (communication link failure) so callers can identify retryable errors. Also fixed `DatabricksError` (RuntimeException) from the SDK client being unhandled in CloudFetch download paths.
+- Fixed escaped pattern characters in catalogName for `getSchemas`; the returned catalogName should be unescaped.
+- Fixed `getColumnClassName()` returning null for VARIANT columns in SEA mode by adding VARIANT to the type system.
+- Fixed `getColumns()` returning `DATA_TYPE=0` (NULL) for GEOMETRY/GEOGRAPHY columns in Thrift mode. Now returns `Types.VARCHAR` (12) when geospatial is disabled and `Types.OTHER` (1111) when enabled, consistent with SEA mode.
+- Fixed `getCrossReference()` returning 0 rows when parent args are passed in uppercase. The client-side filter used case-sensitive comparison against server-returned lowercase names.
+
 ## [v3.3.1] - 2026-03-17
 
 ### Added
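The new `CallableStatement` support above uses the standard JDBC `{call proc(?)}` escape syntax with positional IN parameters. A minimal sketch of how a client might build and invoke such a call — `my_proc` is a hypothetical procedure name, and the connection setup is omitted since it requires a live workspace:

```java
import java.sql.CallableStatement;
import java.sql.Connection;

public class CallExample {
    // Builds the JDBC call escape the driver now accepts, e.g. "{call my_proc(?, ?)}".
    static String callEscape(String procedure, int paramCount) {
        StringBuilder sb = new StringBuilder("{call ").append(procedure).append("(");
        for (int i = 0; i < paramCount; i++) {
            if (i > 0) sb.append(", ");
            sb.append("?");
        }
        return sb.append(")}").toString();
    }

    // Sketch of invoking a stored procedure with one IN parameter. "conn" would
    // be a live Databricks connection; per the changelog, OUT/INOUT and named
    // parameters still throw SQLFeatureNotSupportedException in this release.
    static void invoke(Connection conn) throws Exception {
        try (CallableStatement cs = conn.prepareCall(callEscape("my_proc", 1))) {
            cs.setInt(1, 42); // positional IN parameter binding
            cs.execute();
        }
    }

    public static void main(String[] args) {
        System.out.println(callEscape("my_proc", 2)); // {call my_proc(?, ?)}
    }
}
```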

NEXT_CHANGELOG.md

Lines changed: 0 additions & 16 deletions
@@ -3,26 +3,10 @@
 ## [Unreleased]
 
 ### Added
-- Added `CallableStatement` support with IN parameters. `Connection.prepareCall()` now returns a working `DatabricksCallableStatement` that supports positional parameter binding and execution via `{call proc(?)}` JDBC escape syntax. OUT/INOUT parameters and named parameters throw `SQLFeatureNotSupportedException`.
-- Added AI coding agent detection to the User-Agent header. When the driver is invoked by a known AI coding agent (e.g. Claude Code, Cursor, Gemini CLI), `agent/<product>` is appended to the User-Agent string.
 
 ### Updated
-- Added support for using SQL SHOW commands for Thrift-mode metadata operations (`getTables`, `getColumns`, `getSchemas`, `getFunctions`, `getPrimaryKeys`, `getImportedKeys`, `getCrossReference`). Enable by setting `UseQueryForMetadata=1`. This aligns Thrift metadata behavior with Statement Execution API (SEA) mode.
 
 ### Fixed
-- Improved error messages for cancelled statements: operations cancelled via `Statement.cancel()` or closed connections now return SQL state `HY008` (operation cancelled) instead of generic error codes, making it easier for applications to detect and handle cancellations.
-- Fixed a race condition between chunk download error handling and result set close that could cause invalid state transition warnings (`CHUNK_RELEASED -> DOWNLOAD_FAILED`) during Arrow Cloud Fetch operations in resource-constrained environments.
-- Fixed `EnableBatchedInserts` silently falling back to individual execution when table or schema names contain special characters (e.g., hyphens) inside backtick-quoted identifiers. Added a warn log when the fallback occurs.
-- Fixed an `IntervalConverter` crash (`IllegalArgumentException: Invalid interval metadata`) when INTERVAL columns are returned via CloudFetch. Arrow metadata from CloudFetch uses an underscored format (`INTERVAL_YEAR_MONTH`, `INTERVAL_DAY_TIME`) which the driver's regex did not accept.
-- Fixed `Statement` being prematurely closed after queries that return inline results, which prevented re-execution, `getResultSet()`, and `getExecutionResult()` from working. Statements now remain open and reusable until explicitly closed by the caller.
-- Fixed primitive types within complex types (ARRAY, MAP, STRUCT) not being correctly parsed when Arrow serialization uses alternate formats: TIMESTAMP/TIMESTAMP_NTZ as epoch microseconds or component arrays, and BINARY as base64-encoded strings.
-- Fixed `PARSE_SYNTAX_ERROR` for column names containing special characters (e.g., dots) when `EnableBatchedInserts` is enabled, by re-quoting column names with backticks in reconstructed multi-row INSERT statements.
-- Fixed Volume ingestion for SEA mode, which was broken due to the statement being closed prematurely.
-- Fixed unclear `error: [null]` messages during transient HTTP failures (e.g. 502 Bad Gateway) in Thrift polling. Error messages now include server error details and use SQL state `08S01` (communication link failure) so callers can identify retryable errors. Also fixed `DatabricksError` (RuntimeException) from the SDK client being unhandled in CloudFetch download paths.
-- Fixed escaped pattern characters in catalogName for `getSchemas`; the returned catalogName should be unescaped.
-- Fixed `getColumnClassName()` returning null for VARIANT columns in SEA mode by adding VARIANT to the type system.
-- Fixed `getColumns()` returning `DATA_TYPE=0` (NULL) for GEOMETRY/GEOGRAPHY columns in Thrift mode. Now returns `Types.VARCHAR` (12) when geospatial is disabled and `Types.OTHER` (1111) when enabled, consistent with SEA mode.
-- Fixed `getCrossReference()` returning 0 rows when parent args are passed in uppercase. The client-side filter used case-sensitive comparison against server-returned lowercase names.
 
 ---
 *Note: When making changes, please add your change under the appropriate section
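The `UseQueryForMetadata=1` switch graduating out of these unreleased notes is an ordinary connection property. A hedged sketch of enabling it — the workspace URL is a placeholder, and only the property name and value come from the changelog:

```java
public class MetadataModeSketch {
    // Appends the UseQueryForMetadata flag to a JDBC URL. The base URL passed
    // in is a placeholder; substitute your real host and HTTP path.
    static String withQueryMetadata(String baseUrl) {
        return baseUrl + ";UseQueryForMetadata=1";
    }

    public static void main(String[] args) {
        String url = withQueryMetadata(
            "jdbc:databricks://example.cloud.databricks.com:443/default");
        // With a real workspace this would continue:
        //   Connection conn = DriverManager.getConnection(url, user, token);
        //   conn.getMetaData().getTables(null, "my_schema", "%", null);
        // and getTables/getColumns/etc. would run as SQL SHOW commands in
        // Thrift mode, matching SEA-mode behavior.
        System.out.println(url);
    }
}
```

The same flag can equivalently be supplied through a `java.util.Properties` object passed to `DriverManager.getConnection`.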

README.md

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@ Add the following dependency to your `pom.xml`:
 <dependency>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.3.2</version>
 </dependency>
 ```
 

assembly-thin/pom.xml

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@
 <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-parent</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.3.2</version>
 </parent>
 
 <artifactId>databricks-jdbc-thin</artifactId>
@@ -51,7 +51,7 @@
 <dependency>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-core</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.3.2</version>
 </dependency>
 </dependencies>
 

assembly-uber/pom.xml

Lines changed: 2 additions & 2 deletions
@@ -7,7 +7,7 @@
 <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-parent</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.3.2</version>
 </parent>
 
 <artifactId>databricks-jdbc</artifactId>
@@ -51,7 +51,7 @@
 <dependency>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-core</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.3.2</version>
 </dependency>
 </dependencies>
 

jdbc-core/pom.xml

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@
 <parent>
     <groupId>com.databricks</groupId>
     <artifactId>databricks-jdbc-parent</artifactId>
-    <version>3.3.2-SNAPSHOT</version>
+    <version>3.3.2</version>
 </parent>
 
 <artifactId>databricks-jdbc-core</artifactId>

pom.xml

Lines changed: 2 additions & 2 deletions
@@ -5,7 +5,7 @@
 <groupId>com.databricks</groupId>
 <artifactId>databricks-jdbc-parent</artifactId>
 <!-- This value may be modified by a release script to reflect the current version of the driver. -->
-<version>3.3.2-SNAPSHOT</version>
+<version>3.3.2</version>
 <packaging>pom</packaging>
 <name>Databricks JDBC Parent</name>
 <description>Parent POM for Databricks JDBC Driver.</description>
@@ -63,7 +63,7 @@
 <build-helper-maven-plugin.version>3.6.1</build-helper-maven-plugin.version>
 
 <!-- Dependency Versions -->
-<databricks-jdbc-version>3.3.2-SNAPSHOT</databricks-jdbc-version>
+<databricks-jdbc-version>3.3.2</databricks-jdbc-version>
 <arrow.version>18.3.0</arrow.version>
 <commons-lang3.version>3.18.0</commons-lang3.version>
 <commons-configuration.version>2.10.1</commons-configuration.version>

release-notes.txt

Lines changed: 25 additions & 0 deletions
@@ -6,6 +6,31 @@ The release notes summarize enhancements, new features, known issues, workflow c
 
 Version History ==============================================================
 
+3.3.2 ========================================================================
+Released 2026-04-27
+
+Enhancements & New Features
+
+* Added `CallableStatement` support with IN parameters. `Connection.prepareCall()` now returns a working `DatabricksCallableStatement` that supports positional parameter binding and execution via `{call proc(?)}` JDBC escape syntax. OUT/INOUT parameters and named parameters throw `SQLFeatureNotSupportedException`.
+* Added AI coding agent detection to the User-Agent header. When the driver is invoked by a known AI coding agent (e.g. Claude Code, Cursor, Gemini CLI), `agent/<product>` is appended to the User-Agent string.
+* Added support for using SQL SHOW commands for Thrift-mode metadata operations (`getTables`, `getColumns`, `getSchemas`, `getFunctions`, `getPrimaryKeys`, `getImportedKeys`, `getCrossReference`). Enable by setting `UseQueryForMetadata=1`. This aligns Thrift metadata behavior with Statement Execution API (SEA) mode.
+
+Resolved Issues
+
+* Improved error messages for cancelled statements: operations cancelled via `Statement.cancel()` or closed connections now return SQL state `HY008` (operation cancelled) instead of generic error codes.
+* Fixed a race condition between chunk download error handling and result set close that could cause invalid state transition warnings (`CHUNK_RELEASED -> DOWNLOAD_FAILED`) during Arrow Cloud Fetch operations in resource-constrained environments.
+* Fixed `EnableBatchedInserts` silently falling back to individual execution when table or schema names contain special characters (e.g., hyphens) inside backtick-quoted identifiers. Added a warn log when the fallback occurs.
+* Fixed an `IntervalConverter` crash (`IllegalArgumentException: Invalid interval metadata`) when INTERVAL columns are returned via CloudFetch.
+* Fixed `Statement` being prematurely closed after queries that return inline results, which prevented re-execution, `getResultSet()`, and `getExecutionResult()` from working.
+* Fixed primitive types within complex types (ARRAY, MAP, STRUCT) not being correctly parsed when Arrow serialization uses alternate formats: TIMESTAMP/TIMESTAMP_NTZ as epoch microseconds or component arrays, and BINARY as base64-encoded strings.
+* Fixed `PARSE_SYNTAX_ERROR` for column names containing special characters (e.g., dots) when `EnableBatchedInserts` is enabled.
+* Fixed Volume ingestion for SEA mode, which was broken due to the statement being closed prematurely.
+* Fixed unclear `error: [null]` messages during transient HTTP failures (e.g. 502 Bad Gateway) in Thrift polling. Error messages now include server error details and use SQL state `08S01` (communication link failure) so callers can identify retryable errors. Also fixed `DatabricksError` (RuntimeException) from the SDK client being unhandled in CloudFetch download paths.
+* Fixed escaped pattern characters in catalogName for `getSchemas`; the returned catalogName should be unescaped.
+* Fixed `getColumnClassName()` returning null for VARIANT columns in SEA mode by adding VARIANT to the type system.
+* Fixed `getColumns()` returning `DATA_TYPE=0` (NULL) for GEOMETRY/GEOGRAPHY columns in Thrift mode. Now returns `Types.VARCHAR` (12) when geospatial is disabled and `Types.OTHER` (1111) when enabled, consistent with SEA mode.
+* Fixed `getCrossReference()` returning 0 rows when parent args are passed in uppercase.
+
 3.3.1 ========================================================================
 Released 2026-03-17
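The `HY008` and `08S01` SQL states introduced above give applications a stable way to classify driver failures. A self-contained sketch of such a classifier — the `SQLException` instances here are constructed locally for illustration, whereas real ones would be thrown by the driver:

```java
import java.sql.SQLException;

public class SqlStateCheck {
    static final String CANCELLED = "HY008";          // operation cancelled (per 3.3.2 notes)
    static final String COMM_LINK_FAILURE = "08S01";  // transient communication failure

    // True when the statement was cancelled via Statement.cancel() or a closed connection.
    static boolean isCancellation(SQLException e) {
        return CANCELLED.equals(e.getSQLState());
    }

    // True for transient HTTP failures (e.g. 502 during Thrift polling) worth retrying.
    static boolean isRetryable(SQLException e) {
        return COMM_LINK_FAILURE.equals(e.getSQLState());
    }

    public static void main(String[] args) {
        SQLException cancelled = new SQLException("Statement cancelled", CANCELLED);
        SQLException transientHttp = new SQLException("502 Bad Gateway while polling", COMM_LINK_FAILURE);
        System.out.println(isCancellation(cancelled));   // true
        System.out.println(isRetryable(transientHttp));  // true
    }
}
```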

src/main/java/com/databricks/jdbc/common/util/DriverUtil.java

Lines changed: 1 addition & 1 deletion
@@ -20,7 +20,7 @@
 public class DriverUtil {
 
   private static final JdbcLogger LOGGER = JdbcLoggerFactory.getLogger(DriverUtil.class);
-  private static final String DRIVER_VERSION = "3.3.2-SNAPSHOT";
+  private static final String DRIVER_VERSION = "3.3.2";
   private static final String DRIVER_NAME = "oss-jdbc";
   private static final String JDBC_VERSION = "4.3";
 

src/test/java/com/databricks/jdbc/api/impl/DatabricksDatabaseMetaDataTest.java

Lines changed: 1 addition & 1 deletion
@@ -840,7 +840,7 @@ public void testGetDriverName() throws SQLException {
   @Test
   public void testGetDriverVersion() throws SQLException {
     String result = metaData.getDriverVersion();
-    assertEquals("3.3.2-SNAPSHOT", result);
+    assertEquals("3.3.2", result);
   }
 
   @Test
