[#9710] feat(flink-connector): add Flink 1.19 and 1.20 support#11071

Open
FANNG1 wants to merge 1 commit into apache:main from FANNG1:flink-1-19-1-20-support


Conversation

FANNG1 (Contributor) commented May 13, 2026

What changes were proposed in this pull request?

This PR adds Gravitino Flink connector support for Flink 1.19 and Flink 1.20
on top of the versioned-layout baseline established for Flink 1.18 (#10517).

Main changes:

  • Add flink-connector/v1.19/flink + flink-connector/v1.19/flink-runtime
    modules.
  • Add flink-connector/v1.20/flink + flink-connector/v1.20/flink-runtime
    modules.
  • Add version-specific catalog/factory entry classes (*Flink119 /
    *Flink120) for Hive, Iceberg, JDBC-MySQL, JDBC-Postgres, and Paimon.
  • Add typed CatalogCompatFlink119 (behaviour-identical to the default
    hook) and CatalogCompatFlink120 that uses CatalogTable.newBuilder(...)
    because CatalogTable.of(...) is deprecated in Flink 1.20.
  • The JDBC catalog for 1.19 / 1.20 uses the relocated
    org.apache.flink.connector.jdbc.core.* factory packages shipped in
    flink-connector-jdbc 3.3.0-1.19 / 3.3.0-1.20.
  • Add provider-specific integration-test entry classes for each new minor
    version.
  • Add a Flink 1.20-specific override of
    testCreateGravitinoHiveCatalogRequireOptions — Flink 1.20 reports
    missing required catalog options as IllegalArgumentException rather
    than the ValidationException thrown by older minor versions.
  • Extend the Flink CI job to run embedded and deploy mode integration
    tests across 1.18, 1.19, and 1.20.
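For context, the 1.20 compat hook boils down to swapping the deprecated factory method for the builder. A minimal sketch (the CatalogCompat interface and the toCatalogTable signature are assumptions for illustration, not the actual code in this PR; only CatalogCompatFlink120 and CatalogTable.newBuilder(...) come from the change description):

```java
// Hypothetical sketch of the Flink 1.20 compat hook; interface name and
// method signature are assumed, the real hook sits behind
// BaseCatalog#catalogCompat().
public class CatalogCompatFlink120 implements CatalogCompat {
  @Override
  public CatalogTable toCatalogTable(
      Schema schema,
      String comment,
      List<String> partitionKeys,
      Map<String, String> options) {
    // CatalogTable.of(...) is deprecated in Flink 1.20; use the builder.
    return CatalogTable.newBuilder()
        .schema(schema)
        .comment(comment)
        .partitionKeys(partitionKeys)
        .options(options)
        .build();
  }
}
```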

No flink-common changes are required; the 1.18 baseline already exposes
the necessary extension hooks (BaseCatalog#catalogCompat(), the protected
GravitinoJdbcCatalog(Context, ..., AbstractCatalog) constructor, etc.).

This PR supersedes #11061 (closed) and the now-stale #10501.

Why are the changes needed?

The Flink 1.18 baseline (#10517) was intentionally limited to one minor and
deferred 1.19 / 1.20 to follow-up work under parent issue #9710. This PR
is that follow-up, so users can target the two currently-maintained Flink
minor versions.

Fix: #9710

Does this PR introduce any user-facing change?

Yes. Two new connector artifacts are produced:

  • gravitino-flink-connector-runtime-1.19_2.12
  • gravitino-flink-connector-runtime-1.20_2.12

The user-facing SQL syntax is unchanged.

How was this patch tested?

Local validations (all green):

  • ./gradlew spotlessApply
  • ./gradlew :flink-connector:flink-1.19:test -PskipITs
  • ./gradlew :flink-connector:flink-1.20:test -PskipITs
  • ./gradlew :flink-connector:flink-runtime-1.19:test
  • ./gradlew :flink-connector:flink-runtime-1.20:test
  • ./gradlew :flink-connector:flink-runtime-1.19:shadowJar :flink-connector:flink-runtime-1.20:shadowJar

The runtime-jar tests verify that the version-specific SPI descriptors
(META-INF/services/org.apache.flink.table.factories.Factory) are merged
into the shaded uber-jar.
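Conceptually, service-file merging amounts to concatenating each connector jar's descriptor, dropping comments and blanks, and de-duplicating entries. A self-contained illustration (not code from this PR):

```java
import java.util.LinkedHashSet;
import java.util.Set;

// Illustration only: what merging META-INF/services descriptors into an
// uber-jar amounts to. Each input string is one jar's descriptor file;
// the result lists every Factory implementation once, in first-seen order.
public class ServiceFileMerge {
  public static String merge(String... descriptors) {
    Set<String> merged = new LinkedHashSet<>();
    for (String descriptor : descriptors) {
      for (String line : descriptor.split("\n")) {
        String entry = line.trim();
        if (!entry.isEmpty() && !entry.startsWith("#")) {
          merged.add(entry); // LinkedHashSet silently drops duplicates
        }
      }
    }
    return String.join("\n", merged);
  }
}
```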

The full Hive / Iceberg / JDBC / Paimon integration tests for both new
minor versions will run in CI via the updated
.github/workflows/flink-integration-test-action.yml.
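The matrix extension can be pictured roughly as follows (a sketch only; the actual keys and values in flink-integration-test-action.yml may differ):

```yaml
# Hypothetical sketch of the extended CI matrix; real key names may differ.
strategy:
  matrix:
    flink-version: ["1.18", "1.19", "1.20"]
    deploy-mode: ["embedded", "deploy"]
```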

@geyanggang geyanggang self-requested a review May 13, 2026 08:14

Development

Successfully merging this pull request may close these issues.

[Improvement] support multi version for Gravitino Flink connector