Releases: datacontract/datacontract-cli
v0.12.2
[0.12.2] - 2026-05-05
Added
- Added `impala` extra (`pip install datacontract-cli[impala]`), which pulls in `soda-core-impala`. Impala engine support landed in #965 but the install extra was never added; users had to install `soda-core-impala` manually. Also included in `[all]`.
Changed
- breaking: drop the `dbt` extra and the `dbt-core` dependency. `import dbt` now reads `manifest.json` directly with no third-party dependency and works without installing any extra. Minimum supported manifest schema version is v9 (dbt 1.5+). Users who installed `datacontract-cli[dbt]` should switch to plain `datacontract-cli`.
- breaking: the `protobuf` extra now requires the `protoc` compiler installed on the system. Replaces the bundled `grpcio-tools` (~50 MB of platform-specific protoc binaries) with the lighter `protobuf` runtime (>=3.20,<7.0). `import protobuf` raises a clear error with platform-specific install hints if `protoc` is not on `PATH`. Install with `brew install protobuf` (macOS), `sudo apt install protobuf-compiler` (Debian/Ubuntu), etc.; see the README.
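The dependency-free dbt import described above can be illustrated with a short sketch. The `nodes`, `resource_type`, and `columns` keys follow dbt's public manifest schema; the function name and the exact mapping are hypothetical, not the CLI's actual implementation:

```python
import json

def read_dbt_models(manifest_path: str) -> dict:
    """Hypothetical sketch: extract model names and their column types
    from a dbt manifest.json without importing dbt-core."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    models = {}
    for node_id, node in manifest.get("nodes", {}).items():
        # manifest.json lists seeds, tests, etc. alongside models
        if node.get("resource_type") != "model":
            continue
        columns = {
            name: col.get("data_type")
            for name, col in node.get("columns", {}).items()
        }
        models[node.get("name", node_id)] = columns
    return models
```

Because the manifest is plain JSON, no dbt installation is needed at import time; only manifests of schema version v9 or later (dbt 1.5+) are supported by the CLI.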
Fixed
- README install table: add missing `csv`, `excel`, and `oracle` extras. The matching `[project.optional-dependencies]` entries already existed but were undocumented.
- quality: support `{object}` and `${object}` placeholders in SQL quality queries as the ODCS-spec name for the current schema object (alias for `{model}`/`{table}`) (#676)
- `changelog` command help text now advertises `(url or path)` for the V1/V2 arguments, clarifying that HTTP/HTTPS URLs are accepted (#1162)
- breaking: `test` command now exits non-zero when a server is specified but soda-core fails to connect or authenticate (#1181)
- correct swapped `check_type` labels `model_quality_sql` and `field_quality_sql` (#1187)
- `import spark` now emits a native Spark SQL physicalType (e.g. `string`) instead of a Python repr (e.g. `StringType()`). Contracts imported using Spark in v0.11.0–v0.12.1 did not perform type checks and must be re-imported. (#1048)
- Re-add `setuptools` as a base dependency. soda-core's `env_helper.py` imports `from distutils.util import strtobool`; `distutils` was removed from the stdlib in Python 3.12 and stripped from python-build-standalone 3.11 builds. `setuptools` provides the `distutils` shim. Previously pulled in transitively via `grpcio-tools`; now required explicitly. Reverts #1199; see soda-core#2091.
- SLA freshness checks now quote column identifiers with special characters (#1202)
- update field / model quotation for Impala, dataframe, and Kafka (#1202)
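The new `{object}`/`${object}` placeholder support amounts to a simple substitution. The function below is a hypothetical sketch, not the CLI's internal code; it assumes all four spellings resolve to the same object name:

```python
def resolve_placeholders(sql: str, object_name: str) -> str:
    """Replace the ODCS object placeholders ({object}, ${object}) and
    their legacy aliases ({model}, {table}) with the name of the
    schema object currently under test."""
    # "${object}" must be handled before "{object}", otherwise the
    # inner braces are consumed first and a stray "$" is left behind.
    for placeholder in ("${object}", "{object}", "{model}", "{table}"):
        sql = sql.replace(placeholder, object_name)
    return sql
```

A quality query such as `SELECT COUNT(*) FROM {object} WHERE id IS NULL` would then run against whichever model the test loop is currently checking.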
v0.12.1
[0.12.1] - 2026-04-21
This release introduces several changes to improve the usability of datacontract-cli for AI Agents.
- Breaking: Several changes in the CLI syntax (#1157):
Fix in v0.12.1: re-added `--schema` as an alias for the new `--json-schema` (will be removed in v0.13.0).
| Command | Old option | New option |
|---|---|---|
| lint, test, ci, publish, catalog | `--schema <PATH>` (still works until v0.13.0) | `--json-schema <PATH>` |
| export, import | `--format <FORMAT> <OPTIONS>` | `<FORMAT> <OPTIONS>` (drop `--format`) |
| **Export options:** | | |
| export --format dbt | `--format dbt` | `dbt-models` (format renamed) |
| export --format great-expectations | `--sql-server-type <TYPE>` | `--dialect <TYPE>` |
| export --format rdf | `--rdf-base <URI>` | `--base <URI>` |
| export --format sql | `--sql-server-type <TYPE>` | `--dialect <TYPE>` |
| export --format sql-query | `--sql-server-type <TYPE>` | `--dialect <TYPE>` |
| **Import options:** | | |
| import --format bigquery | `--bigquery-[project\|dataset\|table] <NAME>` | `--[project\|dataset\|table] <NAME>` |
| import --format dbt | `--dbt-model <NAME>` | `--model <NAME>` |
| import --format glue | `--source <NAME>`, `--glue-table <NAME>` | `--database <NAME>`, `--table <NAME>` |
| import --format iceberg | `--iceberg-table <NAME>` | `--table <NAME>` |
| import --format unity | `--unity-table-full-name <NAME>` | `--table <NAME>` |
| import --format spark | `--source <NAMES>` | `--tables <NAMES>` |
| import | `--template` | dropped (was a no-op) |
The `--schema` option (referring to the ODCS JSON schema) was renamed to `--json-schema` to avoid confusion with `--schema-name`, which selects the schema within the data contract to test.
- Error messages for uncaught exceptions are now shortened. Pass `--debug` (or set `DATACONTRACT_CLI_DEBUG=1`) to see the full traceback. (#1175)
- Add example calls to `--help` outputs (#1176)
- Add explicit errors when required environment variables for soda connections are missing (#1177)
- Validate some of the CLI options against their allowed values instead of accepting any string (#1178)
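The explicit environment-variable errors (#1177) can be sketched as follows. The helper name and the variable names are hypothetical, chosen only for illustration; the point is failing with one clear message listing every missing variable instead of a `KeyError` deep inside a connector:

```python
import os

def require_env(*names: str) -> dict:
    """Hypothetical sketch: validate that all connection-credential
    environment variables are set before opening a soda connection."""
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(
            "Missing required environment variables: " + ", ".join(missing)
        )
    return {n: os.environ[n] for n in names}
```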
v0.12.0
[0.12.0] - 2026-04-20
This release introduces several changes to improve the usability of datacontract-cli for AI Agents.
- Breaking: Several changes in the CLI syntax (#1157):
Fix in v0.12.1: re-added `--schema` as an alias for the new `--json-schema` (will be removed in v0.13.0).
| Command | Old option | New option |
|---|---|---|
| lint, test, ci, publish, catalog | `--schema <PATH>` (still works until v0.13.0) | `--json-schema <PATH>` |
| export, import | `--format <FORMAT> <OPTIONS>` | `<FORMAT> <OPTIONS>` (drop `--format`) |
| **Export options:** | | |
| export --format dbt | `--format dbt` | `dbt-models` (format renamed) |
| export --format great-expectations | `--sql-server-type <TYPE>` | `--dialect <TYPE>` |
| export --format rdf | `--rdf-base <URI>` | `--base <URI>` |
| export --format sql | `--sql-server-type <TYPE>` | `--dialect <TYPE>` |
| export --format sql-query | `--sql-server-type <TYPE>` | `--dialect <TYPE>` |
| **Import options:** | | |
| import --format bigquery | `--bigquery-[project\|dataset\|table] <NAME>` | `--[project\|dataset\|table] <NAME>` |
| import --format dbt | `--dbt-model <NAME>` | `--model <NAME>` |
| import --format glue | `--source <NAME>`, `--glue-table <NAME>` | `--database <NAME>`, `--table <NAME>` |
| import --format iceberg | `--iceberg-table <NAME>` | `--table <NAME>` |
| import --format unity | `--unity-table-full-name <NAME>` | `--table <NAME>` |
| import --format spark | `--source <NAMES>` | `--tables <NAMES>` |
| import | `--template` | dropped (was a no-op) |
The `--schema` option (referring to the ODCS JSON schema) was renamed to `--json-schema` to avoid confusion with `--schema-name`, which selects the schema within the data contract to test.
- Error messages for uncaught exceptions are now shortened. Pass `--debug` (or set `DATACONTRACT_CLI_DEBUG=1`) to see the full traceback. (#1175)
- Add example calls to `--help` outputs (#1176)
- Add explicit errors when required environment variables for soda connections are missing (#1177)
- Validate some of the CLI options against their allowed values instead of accepting any string (#1178)
v0.11.9
[0.11.9] - 2026-04-20
Added
- Added `--checks` option to the `test` command to selectively run check categories: `schema`, `quality`, `servicelevel` (#678)
- Added `--schema-name` option to the `test` command to test a specific schema instead of all schemas (#1079, #1085 @kelsoufi-sanofi)
Fixed
- Move `precision`/`scale` for `number` types from `logicalTypeOptions` to `customProperties` (#1145, #1160 @davidb-tada)
- Emit placeholder server values in SQL importer so generated contracts pass lint (#1146, #1152 @Ai-chan-0411)
- Fix Protobuf export for arrays of objects and improve message/enum naming to UpperCamelCase (#1012 @Schokuroff)
- Exit with code 1 when `--server` name is not found (#1153, #1161 @Ai-chan-0411)
Thanks to @kelsoufi-sanofi for the new --schema-name option on test, and to @Schokuroff, @Ai-chan-0411, and @davidb-tada for their contributions.
v0.11.8
[0.11.8] - 2026-04-10
Added
- Added `ci` command for CI/CD-optimized test runs: multi-file support, GitHub Actions annotations and step summary, Azure DevOps annotations, `--fail-on` flag, `--json` output (#1114)
- Added `changelog` command and API endpoint (#1118 @davidb-tada)
- Added opt-in `--all-errors` mode for `datacontract lint` to report all JSON Schema validation errors, with matching `all_errors` support in the Python library and API (#1125 @jmbenedetto)
- Added `--schema-name` option to custom model export (#978 @AntoineGiraud)
Fixed
- Avro importer now raises an error for union fields with multiple non-null types, which are not supported by ODCS (#1124)
- Fix SQL export generating multiple PRIMARY KEY constraints for composite keys (#1026,#1092 @barry0451 @dwestheide)
- Preserve parametrized physicalTypes for SQL export (#1086,#1093 @barry0451 @alexander-griesbeck)
- Fix incorrect SQL type mappings: SQL Server `double`/`jsonb`, MySQL bare `varchar`, missing Trino types (#1110)
- Fix markdown export breaking table structure when extra field values contain pipe characters (#832, #1117 @barry0451 @grepwood)
- Fix dbt import using incorrect physicalType instead of actual materialization type (#1136)
- Remove unnecessary numpy dependency from databricks and kafka extras (#1135 @kayhendriksen)
Special thanks to @davidb-tada for the outstanding contribution of the new changelog command and API endpoint! Also thanks to @barry0451 for multiple quality fixes across the SQL exporter and markdown export, and to @AntoineGiraud and @jmbenedetto for their feature contributions.
v0.11.7
[0.11.7] - 2026-03-24
Fixed
- Escape single quotes in string values for SodaCL checks (#1090)
- Escape BigQuery field and model names with backticks for SodaCL checks (#736)
- Escape Databricks model names with backticks for SodaCL checks
- Fixed catalog export SpecView not having a tags property for the index.html template (#1059)
- Fix SQL importer type mappings: binary types, datetime/time, uuid now map to correct ODCS logicalType and format (#790)
v0.11.6
[0.11.6] - 2026-03-17
Fixed
- Fix parser error for CSV / Parquet table names containing special characters (#1066)
- Fix BigQuery export failing with "Unsupported type" for parameterized physicalType like `NUMERIC(18, 4)` (#1083)
Added
- Added JSON output format for test results (`--output-format json`)
- Added Azure AD / Entra ID authentication support for SQL Server and Microsoft Fabric
v0.11.5
[0.11.5] - 2026-02-19
Fixed
- Fix BigQuery import for repeated fields (#1017)
- Make Markdown export compatible with XHTML by replacing `<br>` with `<br />` (#1030)
- Add ADC/WIF and impersonation support for BigQuery (#1064)
- Fix Snowflake quoted identifiers by enabling double-quote quoting (#1053)
- Fix retention duration crash for numeric ODCS values (#1051)
- Fix physicalType bypass for precision and scale conversion (#1043)
- Fix mkdir TOCTOU race causing silent JUnit write failure (#1050)
- Fix validation failure for field names with special chars on Databricks (#1049)
- Add Azure support for field name quoting in schema checks (#1025)
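The XHTML compatibility fix above boils down to a single substitution; this is a minimal sketch (not the exporter's actual code) that assumes `<br>` only appears as a literal line-break tag in the exported Markdown:

```python
def xhtml_linebreaks(markdown: str) -> str:
    """Replace HTML void <br> tags with their self-closing XHTML form,
    which every XML parser accepts while browsers render it the same."""
    return markdown.replace("<br>", "<br />")
```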
v0.11.4
[0.11.4] - 2026-01-19
Changed
- Made `duckdb` an optional dependency. Install with `pip install datacontract-cli[duckdb]` for local/S3/GCS/Azure file testing.
- Removed unused `fastparquet` and `numpy` core dependencies.
Added
- Include searchable tags in catalog index.html
v0.11.3
Fixed
- Fix `datacontract init` to generate ODCS format instead of the deprecated Data Contract Specification (#984)
- Fix ODCS lint failing on optional relationship `type` field by updating open-data-contract-standard to v3.1.2 (#971)
- Restrict DuckDB dependency to < 1.4.0 (#972)
- Fixed schema evolution support for optional fields in CSV and Parquet formats. Optional fields marked with `required: false` are no longer incorrectly treated as required during validation, enabling proper schema evolution where optional fields can be added to contracts without breaking validation of historical data files (#977)
- Fixed decimals in pydantic model export. Fields marked with `type: decimal` will be mapped to `decimal.Decimal` instead of `float`.
- Fix BigQuery test failure for fields with FLOAT or BOOLEAN types by mapping them to equivalent types (BOOL and FLOAT64)
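Why `decimal.Decimal` rather than `float` matters for `type: decimal` fields can be seen in a two-line illustration (this is not the exporter's code, just the underlying rationale):

```python
from decimal import Decimal

# Binary floats cannot represent most decimal amounts exactly,
# which silently corrupts monetary values in a generated model.
price_float = 0.1 + 0.2                          # not exactly 0.3
price_decimal = Decimal("0.1") + Decimal("0.2")  # exactly Decimal("0.3")
```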