error "google.api_core.exceptions.InvalidArgument: 400 Cannot parse as CloudRegion." while try to create BigLake table with google-cloud-bigquery-biglake
I already have a BigLake table named "config" in the same region, created with a stored procedure + Spark via BigQuery: "projects/dudledood-sql-project-1/locations/us/catalogs/iceberg_catalog/database/iceberg_warehouse/tables/config".
But I can't create another one with:
```python
create_table_request = bigquery_biglake_v1.CreateTableRequest(
    # intended: projects/dudledood-sql-project-1/locations/us/catalogs/iceberg_catalog/database/iceberg_warehouse
    parent="projects/{dudledood-sql-project-1}/locations/{us}/catalogs/{iceberg_catalog}/database/{iceberg_warehouse}",
    table_id=table_id,
)

# Make the request
create_table_response = self.biglake_client.create_table(request=create_table_request)

# Handle the response
print(create_table_response)
```
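For comparison, here is a minimal standalone sketch of the same request that builds the parent path with the client library's generated `database_path` helper instead of a hand-written string. The project, location, catalog, and database values are the ones from the question; the bare client and the `table_id` value are illustrative, and the request mirrors the question's in omitting the `table` payload. Note that the helper renders the path with a `databases/` segment, which is worth comparing against the hand-written path above.

```python
from google.cloud import bigquery_biglake_v1

client = bigquery_biglake_v1.MetastoreServiceClient()

# Let the generated helper assemble the parent resource path; no literal
# braces can end up in the string this way.
parent = client.database_path(
    project="dudledood-sql-project-1",
    location="us",
    catalog="iceberg_catalog",
    database="iceberg_warehouse",
)

request = bigquery_biglake_v1.CreateTableRequest(
    parent=parent,
    table_id="config_v2",  # hypothetical table id
)
response = client.create_table(request=request)
print(response)
```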
Full error:
```
Traceback (most recent call last):
  File "/home/***/.local/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 76, in error_remapped_callable
    return callable_(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/grpc/_channel.py", line 1176, in __call__
    return _end_unary_response_blocking(state, call, False, None)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/grpc/_channel.py", line 1005, in _end_unary_response_blocking
    raise _InactiveRpcError(state)  # pytype: disable=not-instantiable
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
grpc._channel._InactiveRpcError: <_InactiveRpcError of RPC that terminated with:
	status = StatusCode.INVALID_ARGUMENT
	details = "Cannot parse as CloudRegion."
	debug_error_string = "UNKNOWN:Error received from peer ipv4:142.250.199.10:443 {created_time:"2024-04-26T11:31:49.70895308+00:00", grpc_status:3, grpc_message:"Cannot parse as CloudRegion."}"
>

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/opt/***/dags/script/iceberg_insertion_sparkappp.py", line 28, in <module>
    biglake_client.create_biglake_table(table_id=table_id)
  File "/tmp/localPyFiles-2899bd6c-42be-48e5-b653-e6df2dad4414/client.py", line 164, in create_biglake_table
    create_table_response = self.biglake_client.create_table(request=create_table_request)
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/google/cloud/bigquery_biglake_v1/services/metastore_service/client.py", line 1882, in create_table
    response = rpc(
               ^^^^
  File "/home/***/.local/lib/python3.11/site-packages/google/api_core/gapic_v1/method.py", line 131, in __call__
    return wrapped_func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/***/.local/lib/python3.11/site-packages/google/api_core/grpc_helpers.py", line 78, in error_remapped_callable
    raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.InvalidArgument: 400 Cannot parse as CloudRegion.
```
@tiltgod Can you try the above request in the BigQuery API Explorer and see if it works?
If you see a similar error there, kindly file an issue with BigQuery. Otherwise, follow up on this issue and we'll be happy to assist.
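For reference, a rough Python equivalent of that API Explorer test, assuming the BigLake REST surface at `biglake.googleapis.com` (method `projects.locations.catalogs.databases.tables.create`) and Application Default Credentials; the `tableId` value is illustrative:

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

# Application Default Credentials with the cloud-platform scope.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
session = AuthorizedSession(credentials)

# Fully expanded parent path, with no brace placeholders left in.
parent = (
    "projects/dudledood-sql-project-1/locations/us/"
    "catalogs/iceberg_catalog/databases/iceberg_warehouse"
)
resp = session.post(
    f"https://biglake.googleapis.com/v1/{parent}/tables",
    params={"tableId": "config_v2"},  # hypothetical table id
    json={},
)
print(resp.status_code, resp.text)
```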
Yes, I still get a similar error:
```
HTTP/1.1 400
content-type: application/json; charset=UTF-8
date: Wed, 01 May 2024 16:59:47 GMT

{
  "error": {
    "code": 400,
    "message": "Cannot parse {US} as CloudRegion.",
    "errors": [
      {
        "message": "Cannot parse {US} as CloudRegion.",
        "domain": "global",
        "reason": "badRequest"
      }
    ],
    "status": "INVALID_ARGUMENT"
  }
}
```
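The `{US}` in that message suggests the braces are reaching the server verbatim, i.e. the brace placeholders from the documented path template were kept when the real values were substituted in. A small illustration of why plain Python strings behave that way (variable names are hypothetical):

```python
project, location = "dudledood-sql-project-1", "us"

# In a plain string literal, braces are ordinary characters and are sent as-is:
plain = "projects/{project}/locations/{location}"
print(plain)  # -> projects/{project}/locations/{location}

# An f-string substitutes the variables and yields a parseable resource path:
interpolated = f"projects/{project}/locations/{location}"
print(interpolated)  # -> projects/dudledood-sql-project-1/locations/us
```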
Thanks for checking. That means it's an API-level issue rather than a client library issue. Could you seek support through the BigQuery support page? This repository only covers the client libraries, so having us in the middle would just slow things down.
Thanks, and good luck.