
[Bug]: SHOW TABLES returns the table, but it cannot be dropped or loaded.

baiyangtx opened this issue 3 years ago · 2 comments

What happened?

A table was created via Spark SQL, but the creation failed due to a Kerberos issue. As a result, no Hive table exists in the HMS, yet the table meta was created successfully in AMS.

Spark's SHOW TABLES command lists this table, but Spark's DROP TABLE command throws a TableNotExistException.

The table is also not shown on the AMS page.
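The inconsistency described above can be illustrated with a minimal in-memory model (hypothetical classes and stores, not the real Arctic/AMS API): listing is served from the AMS table metas, while dropping effectively requires the Hive side to exist, so an orphan meta is visible but undroppable.

```java
import java.util.*;

// Minimal model of the divergence described above (hypothetical, not the real API):
// AMS keeps its own table-meta store, the HMS keeps the actual Hive tables.
class OrphanMetaDemo {
    static Set<String> amsMeta = new HashSet<>();   // table metas persisted in AMS
    static Set<String> hmsTables = new HashSet<>(); // tables that exist in the HMS

    static List<String> showTables() {
        // SHOW TABLES is served from the AMS metas, so orphans are still listed
        return new ArrayList<>(amsMeta);
    }

    static void dropTable(String name) {
        // DROP first resolves/loads the table, which needs the Hive table to exist
        if (!hmsTables.contains(name)) {
            throw new IllegalStateException("Table or view not found: " + name);
        }
        hmsTables.remove(name);
        amsMeta.remove(name);
    }

    public static void main(String[] args) {
        // A Kerberos failure leaves the meta in AMS but no Hive table behind:
        amsMeta.add("zyx_test");

        System.out.println(showTables()); // the orphan table is listed
        try {
            dropTable("zyx_test");
        } catch (IllegalStateException e) {
            System.out.println("DROP failed: " + e.getMessage());
        }
    }
}
```

This is only a sketch of the observed behavior; the real resolution path goes through the Spark catalog plugin and the AMS Thrift API.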

Affects Versions

0.3.2 and earlier

What engines are you seeing the problem on?

Core

How to reproduce

No response

Relevant log output

No response

Anything else

  1. Arctic table metadata should not be created if creating the Hive table fails.
  2. The Thrift API listTables should return the same tables as the AMS page shows.
  3. SHOW TABLES should reflect the deletion when someone drops the Hive table from HiveServer2 or by other means.
  4. AMS should validate the CreateTableMeta parameters to ensure the table can actually be loaded.
  5. Even if the table metadata has problems, DROP TABLE should still succeed.
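Ideas 1 and 5 above can be sketched with hypothetical in-memory stores (not the real Arctic code): create persists nothing when the Hive side fails, and drop cleans up even when one side is already missing.

```java
import java.util.*;

// Hedged sketch of fix ideas 1 and 5 (hypothetical stores, not the real Arctic code).
class ConsistentCatalogSketch {
    final Set<String> amsMeta = new HashSet<>();   // simulated AMS meta store
    final Set<String> hmsTables = new HashSet<>(); // simulated HMS
    boolean hiveIsBroken = false; // simulates e.g. a Kerberos failure

    void createTable(String name) {
        // Idea 1: create the Hive table first; only persist AMS meta if that worked.
        if (hiveIsBroken) {
            throw new IllegalStateException("create hive table failed for " + name);
        }
        hmsTables.add(name);
        try {
            amsMeta.add(name); // persist the meta second
        } catch (RuntimeException e) {
            hmsTables.remove(name); // roll back the Hive table if meta persistence fails
            throw e;
        }
    }

    void dropTable(String name) {
        // Idea 5: drop succeeds even if one side is already missing.
        // Non-short-circuit '|' so both stores are always cleaned.
        boolean existed = amsMeta.remove(name) | hmsTables.remove(name);
        if (!existed) {
            throw new NoSuchElementException("table not found: " + name);
        }
    }
}
```

The ordering (Hive first, meta second, with rollback) is one possible design; the real fix could equally make CreateTableMeta validation on the AMS side reject unloadable tables, as idea 4 suggests.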

Code of Conduct

  • [X] I agree to follow this project's Code of Conduct

baiyangtx avatar Oct 19 '22 11:10 baiyangtx

spark-sql> show tables ;
arctic  create_test_0829
arctic  dim_left_table
arctic  dim_right_table
arctic  hive_table_test
arctic  hive_table_test_3
arctic  hive_table_test_4
arctic  hive_table_test_9
arctic  iowkey
arctic  no_pk_no_partition_10
arctic  no_pk_no_partition_11
arctic  no_pk_no_partition_12
arctic  no_pk_no_partition_4
arctic  no_pk_no_partition_5
arctic  no_pk_no_partition_6
arctic  no_pk_no_partition_7
arctic  no_pk_no_partition_8
arctic  no_pk_no_partition_9
arctic  no_pk_partition
arctic  no_pk_partition_2
arctic  no_pk__partition_2
arctic  ods_article_code_all_d
arctic  orders
arctic  orders_yxx
arctic  pk_no_partition
arctic  pk_partition
arctic  pk_unpart_animal_lt
arctic  sdf_hive_test_4
arctic  sdf_hive_test_6
arctic  sdf_hive_test_7
arctic  sdf_test
arctic  sdf_test_1558
arctic  temp_test_1057
arctic  test_1015
arctic  test_1016
arctic  test_case_0830_0
arctic  test_case_0830_1
arctic  test_case_0830_2
arctic  test_case_1
arctic  test_case_2
arctic  test_case_3
arctic  test_case_4
arctic  test_case_5
arctic  test_create1036
Time taken: 0.126 seconds, Fetched 43 row(s)

spark-sql> create table zyx_test ( id int , data string , pt string, primary key(id) ) using arctic ;
ANTLR Tool version 4.7.2 used for code generation does not match the current runtime version 4.8
ANTLR Runtime version 4.7.1 used for parser compilation does not match the current runtime version 4.8
ANTLR Tool version 4.7.2 used for code generation does not match the current runtime version 4.8
ANTLR Runtime version 4.7.1 used for parser compilation does not match the current runtime version 4.8
2022-10-19 19:14:48,436 [235329] - ERROR [main:Logging@94] - Failed in [create table zyx_test ( id int , data string , pt string, primary key(id) ) using arctic ]
java.lang.IllegalStateException: update table meta failed
    at com.netease.arctic.catalog.BaseArcticCatalog$BaseArcticTableBuilder.createTableMeta(BaseArcticCatalog.java:487)
    at com.netease.arctic.catalog.BaseArcticCatalog$BaseArcticTableBuilder.create(BaseArcticCatalog.java:399)
    at com.netease.arctic.spark.ArcticSparkCatalog.createTable(ArcticSparkCatalog.java:202)
    at org.apache.spark.sql.execution.datasources.v2.CreateTableExec.run(CreateTableExec.scala:41)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result$lzycompute(V2CommandExec.scala:40)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.result(V2CommandExec.scala:40)
    at org.apache.spark.sql.execution.datasources.v2.V2CommandExec.executeCollect(V2CommandExec.scala:46)
    at org.apache.spark.sql.Dataset.$anonfun$logicalPlan$1(Dataset.scala:228)
    at org.apache.spark.sql.Dataset.$anonfun$withAction$1(Dataset.scala:3700)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103)
    at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163)
    at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64)
    at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3698)
    at org.apache.spark.sql.Dataset.<init>(Dataset.scala:228)
    at org.apache.spark.sql.Dataset$.$anonfun$ofRows$2(Dataset.scala:99)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:96)
    at org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:618)
    at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:775)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:613)
    at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:650)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:67)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:381)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1(SparkSQLCLIDriver.scala:500)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.$anonfun$processLine$1$adapted(SparkSQLCLIDriver.scala:494)
    at scala.collection.Iterator.foreach(Iterator.scala:941)
    at scala.collection.Iterator.foreach$(Iterator.scala:941)
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1429)
    at scala.collection.IterableLike.foreach(IterableLike.scala:74)
    at scala.collection.IterableLike.foreach$(IterableLike.scala:73)
    at scala.collection.AbstractIterable.foreach(Iterable.scala:56)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processLine(SparkSQLCLIDriver.scala:494)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:284)
    at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
    at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
    at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
    at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
    at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1039)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1048)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: com.netease.arctic.shade.org.apache.thrift.transport.TTransportException: Socket is closed by peer.
    at com.netease.arctic.shade.org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:130)
    at com.netease.arctic.shade.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at com.netease.arctic.shade.org.apache.thrift.transport.TFramedTransport.readFrame(TFramedTransport.java:141)
    at com.netease.arctic.shade.org.apache.thrift.transport.TFramedTransport.read(TFramedTransport.java:109)
    at com.netease.arctic.shade.org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
    at com.netease.arctic.shade.org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:455)
    at com.netease.arctic.shade.org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:354)
    at com.netease.arctic.shade.org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:243)
    at com.netease.arctic.shade.org.apache.thrift.protocol.TProtocolDecorator.readMessageBegin(TProtocolDecorator.java:135)
    at com.netease.arctic.shade.org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:77)
    at com.netease.arctic.ams.api.ArcticTableMetastore$Client.recv_createTableMeta(ArcticTableMetastore.java:256)
    at com.netease.arctic.ams.api.ArcticTableMetastore$Client.createTableMeta(ArcticTableMetastore.java:243)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.netease.arctic.ams.api.client.ThriftClientPool.lambda$iface$1(ThriftClientPool.java:264)
    at com.sun.proxy.$Proxy34.createTableMeta(Unknown Source)
    at com.netease.arctic.PooledAmsClient.createTableMeta(PooledAmsClient.java:80)
    at com.netease.arctic.catalog.BaseArcticCatalog$BaseArcticTableBuilder.createTableMeta(BaseArcticCatalog.java:482)
    ... 47 more

(The same IllegalStateException and stack trace are then printed a second time to the console.)

spark-sql> create table zyx_test ( id int , data string , pt string, primary key(id) ) using arctic ;
Error in query: Table arctic.zyx_test already exists
spark-sql> show tables ;
(same 43 tables as above, plus: arctic  zyx_test)
Time taken: 0.134 seconds, Fetched 44 row(s)
spark-sql> drop table zyx_test ;
Error in query: Table or view not found for 'DROP TABLE': zyx_test; line 1 pos 0;
'DropTable false, false
+- 'UnresolvedTableOrView [zyx_test], DROP TABLE, true

baiyangtx avatar Oct 19 '22 12:10 baiyangtx

This issue has been automatically marked as stale because it has been open for 180 days with no activity. It will be closed in the next 14 days if no further activity occurs. To permanently prevent this issue from being considered stale, add the label 'not-stale', but commenting on the issue is preferred when possible.

github-actions[bot] avatar Aug 20 '24 00:08 github-actions[bot]

This issue has been closed because it has not received any activity in the last 14 days since being marked as 'stale'.

github-actions[bot] avatar Feb 11 '25 00:02 github-actions[bot]