14 changes: 14 additions & 0 deletions common/pom.xml
@@ -56,6 +56,20 @@
<groupId>com.fasterxml.jackson.core</groupId>
<artifactId>jackson-databind</artifactId>
</exclusion>
+ <!-- jsonschema2pojo 1.3.0 swapped its runtime to the Jackson 3 (tools.jackson.core) namespace;
+ we don't need it (we only consume the annotation API), and 3.0.2 ships several high-severity DoS CVEs. -->
+ <exclusion>
+ <groupId>tools.jackson.core</groupId>
+ <artifactId>jackson-databind</artifactId>
+ </exclusion>
+ <exclusion>
+ <groupId>tools.jackson.core</groupId>
+ <artifactId>jackson-core</artifactId>
+ </exclusion>
+ <exclusion>
+ <groupId>tools.jackson.core</groupId>
+ <artifactId>jackson-annotations</artifactId>
+ </exclusion>
<exclusion>
<groupId>com.google.code.gson</groupId>
<artifactId>gson</artifactId>
2 changes: 1 addition & 1 deletion openmetadata-k8s-operator/pom.xml
@@ -23,7 +23,7 @@
<!-- Kubernetes client version (should match openmetadata-service) -->
<kubernetes-client.version>21.0.1</kubernetes-client.version>
<!-- Jackson version (should match parent project) -->
- <jackson.version>2.17.2</jackson.version>
+ <jackson.version>2.21.2</jackson.version>
💡 Quality: k8s-operator jackson.version property redundant with parent BOM

The parent pom now imports jackson-bom in <dependencyManagement>, which governs versions for all com.fasterxml.jackson.* artifacts across child modules. However, openmetadata-k8s-operator/pom.xml still declares a local <jackson.version> property (line 26) and hardcodes <version>${jackson.version}</version> on its jackson-databind and jackson-dataformat-yaml dependencies (lines 48, 54). This creates a maintenance burden: the property's own comment says it "should match parent project", yet keeping it matched is a manual step. If someone bumps only the parent BOM, the k8s-operator will silently stay on the old version.

Consider removing the local jackson.version property and the explicit <version> tags on the Jackson dependencies — the parent's BOM import will resolve the correct versions automatically.

Suggested fix:

Remove the local jackson.version property (line 26) and
the <version>${jackson.version}</version> tags from the
jackson-databind (line 48) and jackson-dataformat-yaml
(line 54) dependency declarations. The parent's jackson-bom
import will resolve versions automatically.
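
A minimal sketch of what the trimmed declarations could look like (hypothetical — it assumes the parent's jackson-bom import is already in place):

<!-- hypothetical sketch: no <version> tags, so the parent's
     jackson-bom <dependencyManagement> resolves them -->
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
</dependency>
<dependency>
  <groupId>com.fasterxml.jackson.dataformat</groupId>
  <artifactId>jackson-dataformat-yaml</artifactId>
</dependency>

With this shape, mvn dependency:tree on the module should show both artifacts at whatever version the parent BOM pins, with no local override left to drift.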


</properties>
Comment on lines 23 to 27
Copilot AI Apr 27, 2026


This module defines its own <jackson.version> property and pins Jackson dependency versions directly. Since the parent POM now imports jackson-bom (and already defines ${jackson.version}), keeping a module-local override makes it easy for the operator to drift from the platform’s patched Jackson set. Consider deleting this local jackson.version property and letting the operator inherit the parent’s BOM-managed Jackson versions (i.e., omit explicit Jackson <version> tags in dependencies).
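
For reference, the inheritance works because <dependencyManagement> (including imported BOMs) is inherited by child modules. A sketch of such a parent-level import (the actual parent pom may differ):

<!-- sketch of a parent-level BOM import; com.fasterxml.jackson:jackson-bom
     is the published Jackson BOM artifact -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson</groupId>
      <artifactId>jackson-bom</artifactId>
      <version>${jackson.version}</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>

Child modules then declare Jackson dependencies without <version> tags and pick up the managed version automatically.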


<dependencies>
@@ -287,10 +287,11 @@ private Algorithm getAlgorithm(String algorithm, RSAPublicKey publicKey) {
case "RS256" -> Algorithm.RSA256(publicKey, null);
case "RS384" -> Algorithm.RSA384(publicKey, null);
case "RS512" -> Algorithm.RSA512(publicKey, null);
- default -> throw new IllegalArgumentException(
- "Unsupported token validation algorithm: "
- + algorithm
- + ". Supported: RS256, RS384, RS512");
+ default ->
+ throw new IllegalArgumentException(
+ "Unsupported token validation algorithm: "
+ + algorithm
+ + ". Supported: RS256, RS384, RS512");
};
}

@@ -360,8 +360,8 @@ private static List<String> getSupportedScopesForProvider() {
}
return switch (provider) {
case GOOGLE -> List.of("openid", "profile", "email");
- case OKTA, AUTH_0, AWS_COGNITO, CUSTOM_OIDC -> List.of(
- "openid", "profile", "email", "offline_access");
+ case OKTA, AUTH_0, AWS_COGNITO, CUSTOM_OIDC ->
+ List.of("openid", "profile", "email", "offline_access");
case AZURE -> List.of("openid", "profile", "email", "offline_access");
default -> List.of("openid", "profile", "email");
};
2 changes: 1 addition & 1 deletion openmetadata-service/pom.xml
@@ -29,7 +29,7 @@
<jetty.version>12.1.7</jetty.version>
<logback-core.version>1.5.25</logback-core.version>
<logback-classic.version>1.5.25</logback-classic.version>
- <resilience4j-ratelimiter.version>2.3.0</resilience4j-ratelimiter.version>
+ <resilience4j-ratelimiter.version>2.4.0</resilience4j-ratelimiter.version>
<kubernetes-client.version>24.0.0</kubernetes-client.version>
</properties>

@@ -647,31 +647,36 @@ private void validateExtension(
fieldValue =
parseEntityReferences(printer, csvRecord, fieldNumber, fieldValue.toString(), isList);
}
case "date-cp", "dateTime-cp", "time-cp" -> fieldValue =
parseFormattedDateTimeField(
printer,
csvRecord,
fieldNumber,
fieldName,
fieldValue.toString(),
customPropertyType,
propertyConfig);
case "enum" -> fieldValue =
parseEnumType(
printer,
csvRecord,
fieldNumber,
fieldName,
customPropertyType,
fieldValue,
propertyConfig);
case "timeInterval" -> fieldValue =
parseTimeInterval(printer, csvRecord, fieldNumber, fieldName, fieldValue);
case "number", "integer", "timestamp" -> fieldValue =
parseLongField(
printer, csvRecord, fieldNumber, fieldName, customPropertyType, fieldValue);
case "table-cp" -> fieldValue =
parseTableType(printer, csvRecord, fieldNumber, fieldName, fieldValue, propertyConfig);
case "date-cp", "dateTime-cp", "time-cp" ->
fieldValue =
parseFormattedDateTimeField(
printer,
csvRecord,
fieldNumber,
fieldName,
fieldValue.toString(),
customPropertyType,
propertyConfig);
case "enum" ->
fieldValue =
parseEnumType(
printer,
csvRecord,
fieldNumber,
fieldName,
customPropertyType,
fieldValue,
propertyConfig);
case "timeInterval" ->
fieldValue = parseTimeInterval(printer, csvRecord, fieldNumber, fieldName, fieldValue);
case "number", "integer", "timestamp" ->
fieldValue =
parseLongField(
printer, csvRecord, fieldNumber, fieldName, customPropertyType, fieldValue);
case "table-cp" ->
fieldValue =
parseTableType(
printer, csvRecord, fieldNumber, fieldName, fieldValue, propertyConfig);

default -> {}
}
@@ -947,8 +947,8 @@ private void registerAuthenticator(SecurityConfigurationManager catalogConfig) {
case BASIC -> authenticatorHandler = new BasicAuthenticator();
case LDAP -> authenticatorHandler = new LdapAuthenticator();
default ->
- // For all other types, google, okta etc. auth is handled externally
- authenticatorHandler = new NoopAuthenticator();
+ // For all other types, google, okta etc. auth is handled externally
+ authenticatorHandler = new NoopAuthenticator();
}
SecurityConfigurationManager.getInstance().setAuthenticatorHandler(authenticatorHandler);
}
@@ -315,14 +315,15 @@ private void notifyProgressListener(

switch (currentStatus) {
case COMPLETED -> progressListener.onJobCompleted(stats, elapsedMillis);
- case COMPLETED_WITH_ERRORS -> progressListener.onJobCompletedWithErrors(
- stats, elapsedMillis);
- case FAILED -> progressListener.onJobFailed(
- stats,
- new RuntimeException(
- job.getErrorMessage() != null
- ? job.getErrorMessage()
- : "Distributed job failed"));
+ case COMPLETED_WITH_ERRORS ->
+ progressListener.onJobCompletedWithErrors(stats, elapsedMillis);
+ case FAILED ->
+ progressListener.onJobFailed(
+ stats,
+ new RuntimeException(
+ job.getErrorMessage() != null
+ ? job.getErrorMessage()
+ : "Distributed job failed"));
case STOPPED -> progressListener.onJobStopped(stats);
default -> {
/* No special notification for other states */
@@ -617,12 +617,12 @@ public Optional<EventSubscriptionOffset> getEventSubscriptionOffset(UUID subscri

public int countTotalEvents(UUID id, TypedEvent.Status status) {
return switch (status) {
- case FAILED -> Entity.getCollectionDAO()
- .eventSubscriptionDAO()
- .countFailedEventsById(id.toString());
- case SUCCESSFUL -> Entity.getCollectionDAO()
- .eventSubscriptionDAO()
- .countSuccessfulEventsBySubscriptionId(id.toString());
+ case FAILED ->
+ Entity.getCollectionDAO().eventSubscriptionDAO().countFailedEventsById(id.toString());
+ case SUCCESSFUL ->
+ Entity.getCollectionDAO()
+ .eventSubscriptionDAO()
+ .countSuccessfulEventsBySubscriptionId(id.toString());
default -> throw new IllegalArgumentException("Unknown event status: " + status);
};
}
@@ -95,11 +95,14 @@ private String getKpiCriteriaMessage(MetricType metricType, KpiCriteria criteria
if (metricType != MetricType.TIER) {
if (kpiAvailable) {
return switch (criteria) {
case MET -> "Great the Target Set for KPIs has been achieved. It's time to restructure your goals, set new KPIs and progress faster.";
case IN_PROGRESS -> String.format(
"To meet the KPIs you will need a minimum of %s%% %s in the next %s days.",
targetKpi, getMetricTypeMessage(metricType).toLowerCase(), numberOfDaysLeft);
case NOT_MET -> "The Target set for KPIs was not met it’s time to restructure your goals and progress faster.";
case MET ->
"Great the Target Set for KPIs has been achieved. It's time to restructure your goals, set new KPIs and progress faster.";
case IN_PROGRESS ->
String.format(
"To meet the KPIs you will need a minimum of %s%% %s in the next %s days.",
targetKpi, getMetricTypeMessage(metricType).toLowerCase(), numberOfDaysLeft);
case NOT_MET ->
"The Target set for KPIs was not met it’s time to restructure your goals and progress faster.";
};
}
return "You have not set any KPIs yet, it’s time to restructure your goals, set KPIs and progress faster.";
@@ -38,11 +38,12 @@ public void execute(DelegateExecution execution) {
processInstanceId);
updateWorkflowInstance(execution, workflowInstanceRepository);
}
- default -> LOG.debug(
- "[WORKFLOW_INSTANCE_EVENT] Workflow: {}, ProcessInstance: {} - Unsupported event: {}",
- workflowName,
- processInstanceId,
- eventName);
+ default ->
+ LOG.debug(
+ "[WORKFLOW_INSTANCE_EVENT] Workflow: {}, ProcessInstance: {} - Unsupported event: {}",
+ workflowName,
+ processInstanceId,
+ eventName);
}
} catch (Exception exc) {
LOG.error(
@@ -45,12 +45,13 @@ public void execute(DelegateExecution execution) {
currentActivity);
updateStage(varHandler, execution, workflowInstanceStateRepository);
}
- default -> LOG.debug(
- "[STAGE_EVENT] Workflow: {}, ProcessInstance: {}, Activity: {} - Unsupported event: {}",
- workflowName,
- processInstanceId,
- currentActivity,
- eventName);
+ default ->
+ LOG.debug(
+ "[STAGE_EVENT] Workflow: {}, ProcessInstance: {}, Activity: {} - Unsupported event: {}",
+ workflowName,
+ processInstanceId,
+ currentActivity,
+ eventName);
}
} catch (Exception exc) {
LOG.error(
@@ -50,39 +50,49 @@ public static NodeInterface createNode(
return switch (NodeSubType.fromValue(nodeDefinition.getSubType())) {
case START_EVENT -> new StartEvent((StartEventDefinition) nodeDefinition, config);
case END_EVENT -> new EndEvent((EndEventDefinition) nodeDefinition, config);
- case CHECK_ENTITY_ATTRIBUTES_TASK -> new CheckEntityAttributesTask(
- (CheckEntityAttributesTaskDefinition) nodeDefinition, config);
- case CHECK_CHANGE_DESCRIPTION_TASK -> new CheckChangeDescriptionTask(
- (CheckChangeDescriptionTaskDefinition) nodeDefinition, config);
- case SET_ENTITY_ATTRIBUTE_TASK -> new SetEntityAttributeTask(
- (SetEntityAttributeTaskDefinition) nodeDefinition, config);
- case SET_ENTITY_CERTIFICATION_TASK -> new SetEntityCertificationTask(
- (SetEntityCertificationTaskDefinition) nodeDefinition, config);
- case SET_GLOSSARY_TERM_STATUS_TASK -> new SetGlossaryTermStatusTask(
- (SetGlossaryTermStatusTaskDefinition) nodeDefinition, config);
- case USER_APPROVAL_TASK -> new UserApprovalTask(
- (UserApprovalTaskDefinition) nodeDefinition,
- config,
- resolveUserApprovalTaskType(workflowDefinitionName),
- resolveUserApprovalTaskCategory(workflowDefinitionName));
- case CREATE_AND_RUN_INGESTION_PIPELINE_TASK -> new CreateAndRunIngestionPipelineTask(
- (CreateAndRunIngestionPipelineTaskDefinition) nodeDefinition, config);
+ case CHECK_ENTITY_ATTRIBUTES_TASK ->
+ new CheckEntityAttributesTask(
+ (CheckEntityAttributesTaskDefinition) nodeDefinition, config);
+ case CHECK_CHANGE_DESCRIPTION_TASK ->
+ new CheckChangeDescriptionTask(
+ (CheckChangeDescriptionTaskDefinition) nodeDefinition, config);
+ case SET_ENTITY_ATTRIBUTE_TASK ->
+ new SetEntityAttributeTask((SetEntityAttributeTaskDefinition) nodeDefinition, config);
+ case SET_ENTITY_CERTIFICATION_TASK ->
+ new SetEntityCertificationTask(
+ (SetEntityCertificationTaskDefinition) nodeDefinition, config);
+ case SET_GLOSSARY_TERM_STATUS_TASK ->
+ new SetGlossaryTermStatusTask(
+ (SetGlossaryTermStatusTaskDefinition) nodeDefinition, config);
+ case USER_APPROVAL_TASK ->
+ new UserApprovalTask(
+ (UserApprovalTaskDefinition) nodeDefinition,
+ config,
+ resolveUserApprovalTaskType(workflowDefinitionName),
+ resolveUserApprovalTaskCategory(workflowDefinitionName));
+ case CREATE_AND_RUN_INGESTION_PIPELINE_TASK ->
+ new CreateAndRunIngestionPipelineTask(
+ (CreateAndRunIngestionPipelineTaskDefinition) nodeDefinition, config);
case RUN_APP_TASK -> new RunAppTask((RunAppTaskDefinition) nodeDefinition, config);
- case ROLLBACK_ENTITY_TASK -> new RollbackEntityTask(
- (RollbackEntityTaskDefinition) nodeDefinition, config);
- case DATA_COMPLETENESS_TASK -> new DataCompletenessTask(
- (DataCompletenessTaskDefinition) nodeDefinition, config);
- case PARALLEL_GATEWAY -> new ParallelGateway(
- (ParallelGatewayDefinition) nodeDefinition, config);
+ case ROLLBACK_ENTITY_TASK ->
+ new RollbackEntityTask((RollbackEntityTaskDefinition) nodeDefinition, config);
+ case DATA_COMPLETENESS_TASK ->
+ new DataCompletenessTask((DataCompletenessTaskDefinition) nodeDefinition, config);
+ case PARALLEL_GATEWAY ->
+ new ParallelGateway((ParallelGatewayDefinition) nodeDefinition, config);
case SINK_TASK -> new SinkTask((SinkTaskDefinition) nodeDefinition, config);
- case CREATE_RECOGNIZER_FEEDBACK_APPROVAL_TASK -> new CreateRecognizerFeedbackApprovalTask(
- (CreateRecognizerFeedbackApprovalTaskDefinition) nodeDefinition, config);
- case APPLY_RECOGNIZER_FEEDBACK_TASK -> new ApplyRecognizerFeedbackTask(
- (ApplyRecognizerFeedbackTaskDefinition) nodeDefinition, config);
- case REJECT_RECOGNIZER_FEEDBACK_TASK -> new RejectRecognizerFeedbackTask(
- (RejectRecognizerFeedbackTaskDefinition) nodeDefinition, config);
- default -> throw new IllegalArgumentException(
- "Unsupported node subtype: " + nodeDefinition.getSubType());
+ case CREATE_RECOGNIZER_FEEDBACK_APPROVAL_TASK ->
+ new CreateRecognizerFeedbackApprovalTask(
+ (CreateRecognizerFeedbackApprovalTaskDefinition) nodeDefinition, config);
+ case APPLY_RECOGNIZER_FEEDBACK_TASK ->
+ new ApplyRecognizerFeedbackTask(
+ (ApplyRecognizerFeedbackTaskDefinition) nodeDefinition, config);
+ case REJECT_RECOGNIZER_FEEDBACK_TASK ->
+ new RejectRecognizerFeedbackTask(
+ (RejectRecognizerFeedbackTaskDefinition) nodeDefinition, config);
+ default ->
+ throw new IllegalArgumentException(
+ "Unsupported node subtype: " + nodeDefinition.getSubType());
};
}

@@ -19,17 +19,20 @@ public static TriggerInterface createTrigger(WorkflowDefinition workflow) {
String triggerWorkflowId = getTriggerWorkflowId(workflow.getFullyQualifiedName());

return switch (TriggerType.fromValue(workflow.getTrigger().getType())) {
- case EVENT_BASED_ENTITY -> new EventBasedEntityTrigger(
- workflow.getName(),
- triggerWorkflowId,
- (EventBasedEntityTriggerDefinition) workflow.getTrigger());
- case NO_OP -> new NoOpTrigger(
- workflow.getName(), triggerWorkflowId, (NoOpTriggerDefinition) workflow.getTrigger());
- case PERIODIC_BATCH_ENTITY -> new PeriodicBatchEntityTrigger(
- workflow.getName(),
- triggerWorkflowId,
- (PeriodicBatchEntityTriggerDefinition) workflow.getTrigger(),
- hasBatchModeNodes(workflow));
+ case EVENT_BASED_ENTITY ->
+ new EventBasedEntityTrigger(
+ workflow.getName(),
+ triggerWorkflowId,
+ (EventBasedEntityTriggerDefinition) workflow.getTrigger());
+ case NO_OP ->
+ new NoOpTrigger(
+ workflow.getName(), triggerWorkflowId, (NoOpTriggerDefinition) workflow.getTrigger());
+ case PERIODIC_BATCH_ENTITY ->
+ new PeriodicBatchEntityTrigger(
+ workflow.getName(),
+ triggerWorkflowId,
+ (PeriodicBatchEntityTriggerDefinition) workflow.getTrigger(),
+ hasBatchModeNodes(workflow));
};
}

@@ -114,17 +114,20 @@ private Map<String, Object> getConfig(App app, ServiceEntityInterface service) {
Object config = JsonUtils.deepCopy(app.getAppConfiguration(), Object.class);

switch (app.getName()) {
case "CollateAIApplication" -> config =
(JsonUtils.convertValue(config, CollateAIAppConfig.class))
.withFilter(getTableServiceFilter(service.getName()))
.withPatchIfEmpty(true);
case "CollateAIQualityAgentApplication" -> config =
(JsonUtils.convertValue(config, CollateAIQualityAgentAppConfig.class))
.withFilter(getTableServiceFilter(service.getName()));
case "CollateAITierAgentApplication" -> config =
(JsonUtils.convertValue(config, CollateAITierAgentAppConfig.class))
.withFilter(getTableServiceFilter(service.getName()))
.withPatchIfEmpty(true);
case "CollateAIApplication" ->
config =
(JsonUtils.convertValue(config, CollateAIAppConfig.class))
.withFilter(getTableServiceFilter(service.getName()))
.withPatchIfEmpty(true);
case "CollateAIQualityAgentApplication" ->
config =
(JsonUtils.convertValue(config, CollateAIQualityAgentAppConfig.class))
.withFilter(getTableServiceFilter(service.getName()));
case "CollateAITierAgentApplication" ->
config =
(JsonUtils.convertValue(config, CollateAITierAgentAppConfig.class))
.withFilter(getTableServiceFilter(service.getName()))
.withPatchIfEmpty(true);
case "DataInsightsApplication" -> {
DataInsightsAppConfig updatedAppConfig =
(JsonUtils.convertValue(config, DataInsightsAppConfig.class));
@@ -28,7 +28,7 @@ public static String extractFqnFromValue(Object value) {
return null;
}

- // FieldChange values are often stored as JSON strings, try to parse first
+ // FieldChange values are often stored as JSON strings, try to parse first
case String strValue -> {
String trimmed = strValue.trim();
if (trimmed.startsWith("[") || trimmed.startsWith("{")) {