Tim Sweña (Swast)
This is long overdue. Bumping the priority on this.
It's free to read from anonymous tables created by a query.
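For context, a minimal sketch with `google-cloud-bigquery` (not pandas-gbq itself) of reading back the anonymous destination table that a query leaves behind; the sample query is only illustrative:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Run a query; its results land in an anonymous, cached destination table.
query_job = client.query("SELECT 1 AS x")
query_job.result()  # wait for the job to finish

# Reading that anonymous table back (tabledata.list or a BQ Storage read)
# does not incur additional query charges.
table = client.get_table(query_job.destination)
for row in client.list_rows(table):
    print(dict(row))
```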
When we do this, we may want to install the [now officially supported pandas stubs](https://groups.google.com/g/pydata/c/WgrnysX6uV0/m/2065n6w2AQAJ) to ensure pandas-gbq type checks.
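A rough sketch of how that could look as a nox session, assuming we keep the existing `noxfile.py` layout; the session name and targets here are illustrative, not the actual configuration:

```python
# noxfile.py (sketch): type-check pandas-gbq against the official pandas stubs.
import nox


@nox.session
def mypy(session):
    session.install("mypy", "pandas-stubs")
    session.install("-e", ".")
    # "pandas_gbq" is the package directory; adjust if the layout differs.
    session.run("mypy", "pandas_gbq")
```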
There's one failing system test after I updated them to look for the correct dtype on pandas 2.1.0 and above.

```
pytest 'tests/system/test_to_gbq.py::test_dataframe_round_trip_with_table_schema[load_csv-issue365-extreme-datetimes]'
```

I took a look at the...
I think you are correct. Basically, we are now calling `query_and_wait`, which might not create a destination table that we can read from with the BQ Storage API. Such a...
It sounds like we need three values for `use_bqstorage_api`:

* `True`: always use (use `query`)
* `False`: never use (currently works, I believe) (use `query_and_wait` and disable bq storage client...
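A rough sketch of how such a three-valued option could dispatch, assuming the third value is `None` meaning "let the client decide"; the function name is illustrative and this is not the actual pandas-gbq implementation:

```python
from google.cloud import bigquery


def run_query(client: bigquery.Client, sql: str, use_bqstorage_api=None):
    """Illustrative dispatch for a three-valued use_bqstorage_api option."""
    if use_bqstorage_api:
        # True: always use the BQ Storage API. Force a query job so there is
        # a destination table the Storage API can read from.
        query_job = client.query(sql)
        return query_job.result().to_dataframe(create_bqstorage_client=True)
    if use_bqstorage_api is False:
        # False: never use it. query_and_wait may skip the destination table,
        # which is fine because we only read via the REST API here.
        rows = client.query_and_wait(sql)
        return rows.to_dataframe(create_bqstorage_client=False)
    # None (assumed default): let the client library decide.
    rows = client.query_and_wait(sql)
    return rows.to_dataframe()
```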
Hi @Mikerah, I recommend you try out the `use_bqstorage_api=True` option, which uses the BigQuery Storage API to download results (currently in Beta). Creating a destination table shouldn't be required. See:...
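For reference, a minimal usage sketch; the query and project ID are placeholders:

```python
import pandas_gbq

# Download results via the BigQuery Storage API (requires the
# google-cloud-bigquery-storage package to be installed).
df = pandas_gbq.read_gbq(
    "SELECT name, number FROM `bigquery-public-data.usa_names.usa_1910_2013` LIMIT 1000",
    project_id="your-project-id",  # placeholder
    use_bqstorage_api=True,
)
```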
Are you using legacy SQL? https://cloud.google.com/bigquery/docs/writing-results#limitations

The workarounds I suggest are:

- Use standard SQL dialect (recommended)
- OR create a dataset with a default table expiration of a few...
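For the second workaround, a sketch with `google-cloud-bigquery`; the dataset ID and expiration period are placeholders:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Create a scratch dataset whose tables expire automatically, so query
# results written there get cleaned up on their own.
dataset = bigquery.Dataset("your-project-id.scratch_dataset")  # placeholder ID
dataset.default_table_expiration_ms = 24 * 60 * 60 * 1000  # 1 day
dataset = client.create_dataset(dataset, exists_ok=True)
```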
@jlynchMicron I think you are encountering a different error from the one described in https://github.com/googleapis/python-bigquery/issues/1252. I'm not seeing a relevant limit listed at https://cloud.google.com/bigquery/quotas#query_jobs, but you might be returning too...
@jlynchMicron I don't know of a good way to know the result size without actually running the query. In the `google-cloud-bigquery` library, you can run the query and wait for...
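Something along these lines with `google-cloud-bigquery` (the table name is a placeholder): run the query, wait for it, then inspect the size of the destination table before downloading anything.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Run the query and wait for it to finish without downloading rows yet.
query_job = client.query("SELECT * FROM `your-project.your_dataset.big_table`")
query_job.result()

# The (anonymous) destination table reports how big the result actually is.
destination = client.get_table(query_job.destination)
print(f"rows: {destination.num_rows}, bytes: {destination.num_bytes}")
```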