DAT-19308 - feat(databricks): ensure essential table properties Delta.* #648

GitHub Actions / Liquibase Test Harness - Contributed Reports succeeded Jan 10, 2025 in 0s

49 passed, 2 failed and 28 skipped

Tests failed

| Report | Passed | Failed | Skipped | Time |
| --- | --- | --- | --- | --- |
| target/surefire-reports/TEST-liquibase.ext.databricks.ContributedExtensionHarnessTestSuite.xml | | | | 1849s |
| target/surefire-reports/TEST-liquibase.harness.change.ChangeObjectTests.xml | 44✅ | 2❌ | 28⚪ | 1719s |
| target/surefire-reports/TEST-liquibase.harness.data.ChangeDataTests.xml | 5✅ | | | 130s |

✅ target/surefire-reports/TEST-liquibase.ext.databricks.ContributedExtensionHarnessTestSuite.xml

No tests found

| Test suite | Passed | Failed | Skipped | Time |
| --- | --- | --- | --- | --- |
| liquibase.ext.databricks.ContributedExtensionHarnessTestSuite | | | | 1849s |

❌ target/surefire-reports/TEST-liquibase.harness.change.ChangeObjectTests.xml

74 tests were completed in 1719s with 44 passed, 2 failed and 28 skipped.

| Test suite | Passed | Failed | Skipped | Time |
| --- | --- | --- | --- | --- |
| liquibase.harness.change.ChangeObjectTests | 44✅ | 2❌ | 28⚪ | 1719s |

❌ liquibase.harness.change.ChangeObjectTests

✅ apply 1initScript against databricks 3.1
⚪ apply addAutoIncrement against databricks 3.1
✅ apply addCheckConstraint against databricks 3.1
✅ apply addColumn against databricks 3.1
✅ apply addDefaultValue against databricks 3.1
✅ apply addDefaultValueBoolean against databricks 3.1
✅ apply addDefaultValueComputed against databricks 3.1
✅ apply addDefaultValueDate against databricks 3.1
⚪ apply addDefaultValueNumeric against databricks 3.1
⚪ apply addDefaultValueSequenceNext against databricks 3.1
✅ apply addForeignKey against databricks 3.1
✅ apply addLookupTable against databricks 3.1
✅ apply addNotNullConstraint against databricks 3.1
✅ apply addPrimaryKey against databricks 3.1
⚪ apply addUniqueConstraint against databricks 3.1
✅ apply alterCluster against databricks 3.1
⚪ apply alterSequence against databricks 3.1
✅ apply alterTableProperties against databricks 3.1
✅ apply alterViewProperties against databricks 3.1
✅ apply analyzeTable against databricks 3.1
✅ apply createClusteredTable against databricks 3.1
✅ apply createClusteredTableNew against databricks 3.1
✅ apply createComplexTypesTable against databricks 3.1
✅ apply createExternalCsvTable against databricks 3.1
⚪ apply createFunction against databricks 3.1
⚪ apply createIndex against databricks 3.1
⚪ apply createPackage against databricks 3.1
⚪ apply createPackageBody against databricks 3.1
❌ apply createPartitionedTable against databricks 3.1
	org.opentest4j.AssertionFailedError:
⚪ apply createProcedure against databricks 3.1
⚪ apply createProcedureFromFile against databricks 3.1
⚪ apply createSequence against databricks 3.1
❌ apply createTable against databricks 3.1
	org.opentest4j.AssertionFailedError:
✅ apply createTableDataTypeText against databricks 3.1
✅ apply createTableTimestamp against databricks 3.1
✅ apply createTableWithDefaultValues against databricks 3.1
✅ apply createTableWithTwoIdentityColumns against databricks 3.1
⚪ apply createTrigger against databricks 3.1
✅ apply createView against databricks 3.1
⚪ apply disableCheckConstraint against databricks 3.1
⚪ apply disableTrigger against databricks 3.1
✅ apply dropAllForeignKeyConstraints against databricks 3.1
✅ apply dropCheckConstraint against databricks 3.1
✅ apply dropColumn against databricks 3.1
⚪ apply dropDefaultValue against databricks 3.1
✅ apply dropForeignKey against databricks 3.1
⚪ apply dropFunction against databricks 3.1
⚪ apply dropIndex against databricks 3.1
✅ apply dropNotNullConstraint against databricks 3.1
✅ apply dropPrimaryKey against databricks 3.1
⚪ apply dropProcedure against databricks 3.1
⚪ apply dropSequence against databricks 3.1
✅ apply dropTable against databricks 3.1
⚪ apply dropTrigger against databricks 3.1
⚪ apply dropUniqueConstraint against databricks 3.1
✅ apply dropView against databricks 3.1
⚪ apply enableCheckConstraint against databricks 3.1
⚪ apply enableTrigger against databricks 3.1
✅ apply executeCommand against databricks 3.1
✅ apply mergeColumns against databricks 3.1
⚪ apply modifyDataType against databricks 3.1
✅ apply modifySql against databricks 3.1
✅ apply optimizeTable against databricks 3.1
✅ apply renameColumn against databricks 3.1
⚪ apply renameSequence against databricks 3.1
⚪ apply renameTable against databricks 3.1
⚪ apply renameTrigger against databricks 3.1
✅ apply renameView against databricks 3.1
✅ apply setColumnRemarks against databricks 3.1
✅ apply setTableRemarks against databricks 3.1
✅ apply sql against databricks 3.1
✅ apply sqlFile against databricks 3.1
✅ apply vacuumTable against databricks 3.1
✅ apply #testInput.changeObject against #testInput.databaseName #testInput.version

✅ target/surefire-reports/TEST-liquibase.harness.data.ChangeDataTests.xml

5 tests were completed in 130s with 5 passed, 0 failed and 0 skipped.

| Test suite | Passed | Failed | Skipped | Time |
| --- | --- | --- | --- | --- |
| liquibase.harness.data.ChangeDataTests | 5✅ | | | 130s |

✅ liquibase.harness.data.ChangeDataTests

✅ apply delete against databricks 3.1
✅ apply insert against databricks 3.1
✅ apply loadData against databricks 3.1
✅ apply loadUpdateData against databricks 3.1
✅ apply #testInput.changeData against #testInput.databaseName #testInput.version

Annotations

Check failure on line 0 in target/surefire-reports/TEST-liquibase.harness.change.ChangeObjectTests.xml

github-actions / Liquibase Test Harness - Contributed Reports

liquibase.harness.change.ChangeObjectTests ► apply createPartitionedTable against databricks 3.1

Failed test found in:
  target/surefire-reports/TEST-liquibase.harness.change.ChangeObjectTests.xml
Error:
  org.opentest4j.AssertionFailedError: 
Raw output
org.opentest4j.AssertionFailedError: 
liquibase.exception.LiquibaseException: liquibase.exception.MigrationFailedException: Migration failed for changeset liquibase/harness/change/changelogs/databricks/createPartitionedTable.xml::2::mykhailo:
     Reason: liquibase.exception.DatabaseException: [Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: 42KD7, Query: CREATE TAB***, Error message from Server: org.apache.hive.service.cli.HiveSQLException: Error running query: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties at s3://databricks-th/partitioned_delta_table.

== Specified ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
delta.feature.allowcolumndefaults=supported
this.is.my.key=12
this.is.my.key2=true

== Existing ==
delta.enableDeletionVectors=true
this.is.my.key=12
this.is.my.key2=true

	at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:49)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:805)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
	at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:641)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$5(SparkExecuteStatementOperation.scala:486)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.sql.execution.SQLExecution$.withRootExecution(SQLExecution.scala:711)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:486)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48)
	at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:276)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:272)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags0(DatabricksSparkUsageLogger.scala:88)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.withAttributionTags(DatabricksSparkUsageLogger.scala:191)
	at com.databricks.spark.util.UsageLogging.$anonfun$withAttributionTags$1(UsageLogger.scala:617)
	at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:729)
	at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:738)
	at com.databricks.spark.util.UsageLogging.withAttributionTags(UsageLogger.scala:617)
	at com.databricks.spark.util.UsageLogging.withAttributionTags$(UsageLogger.scala:615)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withAttributionTags(SparkExecuteStatementOperation.scala:71)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.$anonfun$withLocalProperties$12(ThriftLocalProperties.scala:234)
	at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:229)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:89)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:71)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:463)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:449)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:499)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties at s3://databricks-th/partitioned_delta_table.

== Specified ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
delta.feature.allowcolumndefaults=supported
this.is.my.key=12
this.is.my.key2=true

== Existing ==
delta.enableDeletionVectors=true
this.is.my.key=12
this.is.my.key2=true

	at com.databricks.sql.transaction.tahoe.DeltaErrorsBase.createTableWithDifferentPropertiesException(DeltaErrors.scala:1402)
	at com.databricks.sql.transaction.tahoe.DeltaErrorsBase.createTableWithDifferentPropertiesException$(DeltaErrors.scala:1398)
	at com.databricks.sql.transaction.tahoe.DeltaErrors$.createTableWithDifferentPropertiesException(DeltaErrors.scala:3573)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.verifyTableMetadata(CreateDeltaTableCommand.scala:859)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.createActionsForNewTableOrVerify$1(CreateDeltaTableCommand.scala:601)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.handleCreateTable(CreateDeltaTableCommand.scala:610)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.$anonfun$handleCommit$1(CreateDeltaTableCommand.scala:327)
	at com.databricks.sql.transaction.tahoe.OptimisticTransaction$.withActive(OptimisticTransaction.scala:215)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.handleCommit(CreateDeltaTableCommand.scala:282)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.$anonfun$run$3(CreateDeltaTableCommand.scala:216)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.withOperationTypeTag(DeltaLogging.scala:227)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.withOperationTypeTag$(DeltaLogging.scala:214)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.withOperationTypeTag(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$2(DeltaLogging.scala:166)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:296)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:294)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.recordFrameProfile(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$1(DeltaLogging.scala:165)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:528)
	at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:633)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:656)
	at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48)
	at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:276)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:272)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:628)
	at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:537)
	at com.databricks.spark.util.PublicDBLogging.recordOperationWithResultTags(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:529)
	at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:495)
	at com.databricks.spark.util.PublicDBLogging.recordOperation(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.spark.util.PublicDBLogging.recordOperation0(DatabricksSparkUsageLogger.scala:84)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:169)
	at com.databricks.spark.util.UsageLogger.recordOperation(UsageLogger.scala:70)
	at com.databricks.spark.util.UsageLogger.recordOperation$(UsageLogger.scala:57)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:128)
	at com.databricks.spark.util.UsageLogging.recordOperation(UsageLogger.scala:511)
	at com.databricks.spark.util.UsageLogging.recordOperation$(UsageLogger.scala:490)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.recordOperation(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperationInternal(DeltaLogging.scala:164)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperation(DeltaLogging.scala:154)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperation$(DeltaLogging.scala:144)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.recordDeltaOperation(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.run(CreateDeltaTableCommand.scala:194)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createDeltaTable$1(DeltaCatalog.scala:368)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:296)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:294)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:121)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.com$databricks$sql$transaction$tahoe$catalog$DeltaCatalog$$createDeltaTable(DeltaCatalog.scala:162)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createTableWithRowColumnControls$1(DeltaCatalog.scala:831)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:296)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:294)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:121)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.createTableWithRowColumnControls(DeltaCatalog.scala:799)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.createTable(DeltaCatalog.scala:789)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.createTable(UnityCatalogV2Proxy.scala:229)
	at org.apache.spark.sql.connector.catalog.TableCatalog.createTable(TableCatalog.java:246)
	at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:58)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$2(V2CommandExec.scala:48)
	at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180)
	at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$1(V2CommandExec.scala:48)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:47)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:45)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:56)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$5(QueryExecution.scala:385)
	at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:385)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:182)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:385)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:462)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:800)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:334)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:205)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:737)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:381)
	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1179)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:377)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:327)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:374)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:349)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:379)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:375)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:481)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:349)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:436)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:349)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:286)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:283)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:343)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:636)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:607)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:593)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:607)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:680)
	at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:537)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:680)
	... 43 more
. [Failed SQL: (500051) CREATE TABLE main.liquibase_harness_test_ds.partitioned_delta_table (id INT, name VARCHAR(20), some_column BIGINT) USING delta TBLPROPERTIES('delta.feature.allowColumnDefaults' = 'supported', 'delta.columnMapping.mode' = 'name', 'delta.enableDeletionVectors' = true, 'this.is.my.key' = 12, 'this.is.my.key2' = true) LOCATION 's3://databricks-th/partitioned_delta_table' PARTITIONED BY (id, some_column)]
	at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:38)
	at org.junit.jupiter.api.Assertions.fail(Assertions.java:138)
	at liquibase.harness.util.TestUtils.executeCommandScope(TestUtils.groovy:66)
	at liquibase.harness.change.ChangeObjectTests.apply #testInput.changeObject against #testInput.databaseName #testInput.version(ChangeObjectTests.groovy:76)
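
Both failed changesets (`createPartitionedTable` here and `createTable` below) appear to share one root cause: the generated `CREATE TABLE` targets an external `LOCATION` whose Delta log, presumably left behind by an earlier harness run, was written with the old property set, while this PR now also injects `delta.columnMapping.mode` and `delta.feature.allowColumnDefaults`. Delta compares the specified properties against the existing log at the location and rejects the statement. A minimal sketch of the mismatch, reusing the names from the failed SQL above but otherwise illustrative:

```sql
-- First run: the Delta log at the external location is written with the
-- original property set (no column mapping, no column-defaults feature).
CREATE TABLE main.liquibase_harness_test_ds.partitioned_delta_table (
  id INT, name VARCHAR(20), some_column BIGINT
) USING delta
TBLPROPERTIES (
  'delta.enableDeletionVectors' = true,
  'this.is.my.key' = 12,
  'this.is.my.key2' = true
)
LOCATION 's3://databricks-th/partitioned_delta_table'
PARTITIONED BY (id, some_column);

-- Dropping an external table removes only the catalog entry; the Delta log
-- under the S3 prefix stays behind.
DROP TABLE main.liquibase_harness_test_ds.partitioned_delta_table;

-- Re-run with the two extra delta.* properties this PR adds: Delta sees an
-- existing log at the location whose properties do not match and raises
-- DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY, exactly as in the report above.
CREATE TABLE main.liquibase_harness_test_ds.partitioned_delta_table (
  id INT, name VARCHAR(20), some_column BIGINT
) USING delta
TBLPROPERTIES (
  'delta.feature.allowColumnDefaults' = 'supported',
  'delta.columnMapping.mode' = 'name',
  'delta.enableDeletionVectors' = true,
  'this.is.my.key' = 12,
  'this.is.my.key2' = true
)
LOCATION 's3://databricks-th/partitioned_delta_table'
PARTITIONED BY (id, some_column);
```

Under this reading the failures are environmental (stale test locations) rather than a defect in the generated SQL: the `== Existing ==` set in the error is simply missing the two `delta.*` keys that the extension now specifies.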

Check failure on line 0 in target/surefire-reports/TEST-liquibase.harness.change.ChangeObjectTests.xml

github-actions / Liquibase Test Harness - Contributed Reports

liquibase.harness.change.ChangeObjectTests ► apply createTable against databricks 3.1

Failed test found in:
  target/surefire-reports/TEST-liquibase.harness.change.ChangeObjectTests.xml
Error:
  org.opentest4j.AssertionFailedError: 
Raw output
org.opentest4j.AssertionFailedError: 
liquibase.exception.LiquibaseException: liquibase.exception.MigrationFailedException: Migration failed for changeset liquibase/harness/change/changelogs/databricks/createTable.xml::2::as:
     Reason: liquibase.exception.DatabaseException: [Databricks][JDBCDriver](500051) ERROR processing query/statement. Error Code: 0, SQL state: 42KD7, Query: CREATE TAB***, Error message from Server: org.apache.hive.service.cli.HiveSQLException: Error running query: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties at s3://databricks-th/test_table_properties.

== Specified ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
delta.feature.allowcolumndefaults=supported
this.is.my.key=12
this.is.my.key2=true

== Existing ==
delta.enableDeletionVectors=true
this.is.my.key=12
this.is.my.key2=true

	at org.apache.spark.sql.hive.thriftserver.HiveThriftServerErrors$.runningQueryError(HiveThriftServerErrors.scala:49)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:805)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.unity.UCSEphemeralState$Handle.runWith(UCSEphemeralState.scala:51)
	at com.databricks.unity.HandleImpl.runWith(UCSHandle.scala:104)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.org$apache$spark$sql$hive$thriftserver$SparkExecuteStatementOperation$$execute(SparkExecuteStatementOperation.scala:641)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$5(SparkExecuteStatementOperation.scala:486)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at org.apache.spark.sql.execution.SQLExecution$.withRootExecution(SQLExecution.scala:711)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.$anonfun$run$2(SparkExecuteStatementOperation.scala:486)
	at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
	at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48)
	at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:276)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:272)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags0(DatabricksSparkUsageLogger.scala:88)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.withAttributionTags(DatabricksSparkUsageLogger.scala:191)
	at com.databricks.spark.util.UsageLogging.$anonfun$withAttributionTags$1(UsageLogger.scala:617)
	at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:729)
	at com.databricks.spark.util.UsageLogging$.withAttributionTags(UsageLogger.scala:738)
	at com.databricks.spark.util.UsageLogging.withAttributionTags(UsageLogger.scala:617)
	at com.databricks.spark.util.UsageLogging.withAttributionTags$(UsageLogger.scala:615)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withAttributionTags(SparkExecuteStatementOperation.scala:71)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.$anonfun$withLocalProperties$12(ThriftLocalProperties.scala:234)
	at com.databricks.spark.util.IdentityClaim$.withClaim(IdentityClaim.scala:48)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties(ThriftLocalProperties.scala:229)
	at org.apache.spark.sql.hive.thriftserver.ThriftLocalProperties.withLocalProperties$(ThriftLocalProperties.scala:89)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.withLocalProperties(SparkExecuteStatementOperation.scala:71)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:463)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2$$anon$3.run(SparkExecuteStatementOperation.scala:449)
	at java.base/java.security.AccessController.doPrivileged(AccessController.java:712)
	at java.base/javax.security.auth.Subject.doAs(Subject.java:439)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1899)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation$$anon$2.run(SparkExecuteStatementOperation.scala:499)
	at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539)
	at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
	at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
	at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
	at java.base/java.lang.Thread.run(Thread.java:840)
Caused by: com.databricks.sql.transaction.tahoe.DeltaAnalysisException: [DELTA_CREATE_TABLE_WITH_DIFFERENT_PROPERTY] The specified properties do not match the existing properties at s3://databricks-th/test_table_properties.

== Specified ==
delta.columnMapping.mode=name
delta.enableDeletionVectors=true
delta.feature.allowcolumndefaults=supported
this.is.my.key=12
this.is.my.key2=true

== Existing ==
delta.enableDeletionVectors=true
this.is.my.key=12
this.is.my.key2=true

	at com.databricks.sql.transaction.tahoe.DeltaErrorsBase.createTableWithDifferentPropertiesException(DeltaErrors.scala:1402)
	at com.databricks.sql.transaction.tahoe.DeltaErrorsBase.createTableWithDifferentPropertiesException$(DeltaErrors.scala:1398)
	at com.databricks.sql.transaction.tahoe.DeltaErrors$.createTableWithDifferentPropertiesException(DeltaErrors.scala:3573)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.verifyTableMetadata(CreateDeltaTableCommand.scala:859)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.createActionsForNewTableOrVerify$1(CreateDeltaTableCommand.scala:601)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.handleCreateTable(CreateDeltaTableCommand.scala:610)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.$anonfun$handleCommit$1(CreateDeltaTableCommand.scala:327)
	at com.databricks.sql.transaction.tahoe.OptimisticTransaction$.withActive(OptimisticTransaction.scala:215)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.handleCommit(CreateDeltaTableCommand.scala:282)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.$anonfun$run$3(CreateDeltaTableCommand.scala:216)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.withOperationTypeTag(DeltaLogging.scala:227)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.withOperationTypeTag$(DeltaLogging.scala:214)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.withOperationTypeTag(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$2(DeltaLogging.scala:166)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:296)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:294)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.recordFrameProfile(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.$anonfun$recordDeltaOperationInternal$1(DeltaLogging.scala:165)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperation$1(UsageLogging.scala:528)
	at com.databricks.logging.UsageLogging.executeThunkAndCaptureResultTags$1(UsageLogging.scala:633)
	at com.databricks.logging.UsageLogging.$anonfun$recordOperationWithResultTags$4(UsageLogging.scala:656)
	at com.databricks.logging.AttributionContextTracing.$anonfun$withAttributionContext$1(AttributionContextTracing.scala:48)
	at com.databricks.logging.AttributionContext$.$anonfun$withValue$1(AttributionContext.scala:276)
	at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
	at com.databricks.logging.AttributionContext$.withValue(AttributionContext.scala:272)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext(AttributionContextTracing.scala:46)
	at com.databricks.logging.AttributionContextTracing.withAttributionContext$(AttributionContextTracing.scala:43)
	at com.databricks.spark.util.PublicDBLogging.withAttributionContext(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags(AttributionContextTracing.scala:95)
	at com.databricks.logging.AttributionContextTracing.withAttributionTags$(AttributionContextTracing.scala:76)
	at com.databricks.spark.util.PublicDBLogging.withAttributionTags(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.UsageLogging.recordOperationWithResultTags(UsageLogging.scala:628)
	at com.databricks.logging.UsageLogging.recordOperationWithResultTags$(UsageLogging.scala:537)
	at com.databricks.spark.util.PublicDBLogging.recordOperationWithResultTags(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.logging.UsageLogging.recordOperation(UsageLogging.scala:529)
	at com.databricks.logging.UsageLogging.recordOperation$(UsageLogging.scala:495)
	at com.databricks.spark.util.PublicDBLogging.recordOperation(DatabricksSparkUsageLogger.scala:29)
	at com.databricks.spark.util.PublicDBLogging.recordOperation0(DatabricksSparkUsageLogger.scala:84)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:169)
	at com.databricks.spark.util.UsageLogger.recordOperation(UsageLogger.scala:70)
	at com.databricks.spark.util.UsageLogger.recordOperation$(UsageLogger.scala:57)
	at com.databricks.spark.util.DatabricksSparkUsageLogger.recordOperation(DatabricksSparkUsageLogger.scala:128)
	at com.databricks.spark.util.UsageLogging.recordOperation(UsageLogger.scala:511)
	at com.databricks.spark.util.UsageLogging.recordOperation$(UsageLogger.scala:490)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.recordOperation(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperationInternal(DeltaLogging.scala:164)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperation(DeltaLogging.scala:154)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordDeltaOperation$(DeltaLogging.scala:144)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.recordDeltaOperation(CreateDeltaTableCommand.scala:79)
	at com.databricks.sql.transaction.tahoe.commands.CreateDeltaTableCommand.run(CreateDeltaTableCommand.scala:194)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createDeltaTable$1(DeltaCatalog.scala:368)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:296)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:294)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:121)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.com$databricks$sql$transaction$tahoe$catalog$DeltaCatalog$$createDeltaTable(DeltaCatalog.scala:162)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.$anonfun$createTableWithRowColumnControls$1(DeltaCatalog.scala:831)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile(DeltaLogging.scala:296)
	at com.databricks.sql.transaction.tahoe.metering.DeltaLogging.recordFrameProfile$(DeltaLogging.scala:294)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.recordFrameProfile(DeltaCatalog.scala:121)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.createTableWithRowColumnControls(DeltaCatalog.scala:799)
	at com.databricks.sql.transaction.tahoe.catalog.DeltaCatalog.createTable(DeltaCatalog.scala:789)
	at com.databricks.sql.managedcatalog.UnityCatalogV2Proxy.createTable(UnityCatalogV2Proxy.scala:229)
	at org.apache.spark.sql.connector.catalog.TableCatalog.createTable(TableCatalog.java:246)
	at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:58)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$2(V2CommandExec.scala:48)
	at org.apache.spark.sql.execution.SparkPlan.runCommandWithAetherOff(SparkPlan.scala:180)
	at org.apache.spark.sql.execution.SparkPlan.runCommandInAetherOrSpark(SparkPlan.scala:191)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.$anonfun$result$1(V2CommandExec.scala:48)
	at com.databricks.spark.util.FrameProfiler$.record(FrameProfiler.scala:94)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:47)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:45)
	at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:56)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$5(QueryExecution.scala:385)
	at com.databricks.util.LexicalThreadLocal$Handle.runWith(LexicalThreadLocal.scala:63)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$4(QueryExecution.scala:385)
	at org.apache.spark.sql.catalyst.QueryPlanningTracker$.withTracker(QueryPlanningTracker.scala:182)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$3(QueryExecution.scala:385)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$10(SQLExecution.scala:462)
	at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:800)
	at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId0$1(SQLExecution.scala:334)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId0(SQLExecution.scala:205)
	at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:737)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$2(QueryExecution.scala:381)
	at org.apache.spark.sql.execution.QueryExecution$.withInternalError(QueryExecution.scala:1179)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.$anonfun$applyOrElse$1(QueryExecution.scala:377)
	at org.apache.spark.sql.execution.QueryExecution.org$apache$spark$sql$execution$QueryExecution$$withMVTagsIfNecessary(QueryExecution.scala:327)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:374)
	at org.apache.spark.sql.execution.QueryExecution$$anonfun$$nestedInanonfun$eagerlyExecuteCommands$1$1.applyOrElse(QueryExecution.scala:349)
	at org.apache.spark.sql.catalyst.trees.TreeNode.$anonfun$transformDownWithPruning$1(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(origin.scala:85)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDownWithPruning(TreeNode.scala:505)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.org$apache$spark$sql$catalyst$plans$logical$AnalysisHelper$$super$transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning(AnalysisHelper.scala:379)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.transformDownWithPruning$(AnalysisHelper.scala:375)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.transformDownWithPruning(LogicalPlan.scala:40)
	at org.apache.spark.sql.catalyst.trees.TreeNode.transformDown(TreeNode.scala:481)
	at org.apache.spark.sql.execution.QueryExecution.$anonfun$eagerlyExecuteCommands$1(QueryExecution.scala:349)
	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:436)
	at org.apache.spark.sql.execution.QueryExecution.eagerlyExecuteCommands(QueryExecution.scala:349)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted$lzycompute(QueryExecution.scala:286)
	at org.apache.spark.sql.execution.QueryExecution.commandExecuted(QueryExecution.scala:283)
	at org.apache.spark.sql.Dataset.<init>(Dataset.scala:343)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$2(SparkExecuteStatementOperation.scala:636)
	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:1180)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$analyzeQuery$1(SparkExecuteStatementOperation.scala:607)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.getOrCreateDF(SparkExecuteStatementOperation.scala:593)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.analyzeQuery(SparkExecuteStatementOperation.scala:607)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$5(SparkExecuteStatementOperation.scala:680)
	at org.apache.spark.util.Utils$.timeTakenMs(Utils.scala:537)
	at org.apache.spark.sql.hive.thriftserver.SparkExecuteStatementOperation.$anonfun$execute$1(SparkExecuteStatementOperation.scala:680)
	... 43 more
. [Failed SQL: (500051) CREATE TABLE main.liquibase_harness_test_ds.test_table_properties (test_id INT NOT NULL, CONSTRAINT PK_TEST_TABLE_PROPERTIES PRIMARY KEY (test_id)) USING delta TBLPROPERTIES('delta.feature.allowColumnDefaults' = 'supported', 'delta.columnMapping.mode' = 'name', 'delta.enableDeletionVectors' = true, 'this.is.my.key' = 12, 'this.is.my.key2' = true) LOCATION 's3://databricks-th/test_table_properties']
	at org.junit.jupiter.api.AssertionUtils.fail(AssertionUtils.java:38)
	at org.junit.jupiter.api.Assertions.fail(Assertions.java:138)
	at liquibase.harness.util.TestUtils.executeCommandScope(TestUtils.groovy:66)
	at liquibase.harness.change.ChangeObjectTests.apply #testInput.changeObject against #testInput.databaseName #testInput.version(ChangeObjectTests.groovy:76)
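
The same stale-location reading applies here: the `== Existing ==` properties at `s3://databricks-th/test_table_properties` predate the two `delta.*` keys the extension now injects. Assuming the leftover Delta logs are disposable test artifacts, one possible way to unblock reruns (a sketch only; the right cleanup depends on the harness environment) is to empty the S3 prefixes out of band, or to re-register each location and align its properties before the harness recreates the table:

```sql
-- Sketch: point a table at the stale Delta log, then bring its properties in
-- line with what the extension now specifies, so the next CREATE TABLE at
-- this location passes Delta's property check.
CREATE TABLE IF NOT EXISTS main.liquibase_harness_test_ds.test_table_properties
USING delta
LOCATION 's3://databricks-th/test_table_properties';

-- Enabling column mapping may additionally require a protocol upgrade on
-- older runtimes (delta.minReaderVersion / delta.minWriterVersion).
ALTER TABLE main.liquibase_harness_test_ds.test_table_properties
SET TBLPROPERTIES (
  'delta.columnMapping.mode' = 'name',
  'delta.feature.allowColumnDefaults' = 'supported'
);

-- Drop the catalog entry again; for external tables the (now aligned) Delta
-- log stays in S3, so the harness's next CREATE TABLE should match it.
DROP TABLE main.liquibase_harness_test_ds.test_table_properties;
```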