Tim Sweña (Swast)

302 comments by Tim Sweña (Swast)

This seems to work:

```rust
// TODO(tswast): Handle the connection error gracefully.
let channel = GoogleEnvironment::init_google_services_channel(
    "https://bigquerystorage.googleapis.com",
)
.await
.unwrap();
let read_client =
    // Maximum row size in BigQuery is 100 MB,...
```

I did figure out a way to pass a client object somewhat reasonably:

```rust
async fn read_stream(
    read_client: Arc,
    schema: Arc,
    stream_name: String,
    tx: Arc,
) {
```

Except this...

This is a tough one. I think it does belong here, as the SQLAlchemy connector needs to continue to generate SQL that can be understood by the `google.cloud.bigquery.dbapi` modules. I...
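To make that concrete, here is a rough sketch of what the generated SQL has to look like on the DB-API side: the `google.cloud.bigquery.dbapi` paramstyle is `pyformat`, so the dialect has to emit `%(name)s` placeholders. The query and client setup below are just placeholders, assuming application default credentials are available.

```python
from google.cloud import bigquery
from google.cloud.bigquery import dbapi

# Placeholder setup: assumes application default credentials and a project.
client = bigquery.Client()
connection = dbapi.connect(client)
cursor = connection.cursor()

# The DB-API paramstyle is "pyformat", so generated SQL must use %(name)s
# markers that the dbapi module can bind to query parameters.
cursor.execute(
    "SELECT word FROM `bigquery-public-data.samples.shakespeare` "
    "WHERE corpus = %(corpus)s LIMIT 10",
    {"corpus": "hamlet"},
)
for row in cursor.fetchall():
    print(row)
```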

Note: when doing so, make sure to update internal concord pipelines to accept either `-` or `/` as the version separator.
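As a hypothetical illustration of what accepting either separator means for a tag-parsing step (the pattern and tag names below are made up, not the actual pipeline configuration):

```python
import re

# Accept either "-" or "/" between the package name and the version, e.g.
# "google-cloud-bigquery-v3.0.0" or "google-cloud-bigquery/v3.0.0".
TAG_PATTERN = re.compile(r"^(?P<package>.+?)[-/]v(?P<version>\d+\.\d+\.\d+)$")

for tag in ("google-cloud-bigquery-v3.0.0", "google-cloud-bigquery/v3.0.0"):
    match = TAG_PATTERN.match(tag)
    print(match.group("package"), match.group("version"))
```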

Oh, and re: pandas, it looks like we're about there too. We require `pandas>=0.23.2`, which means if we did a release today I think we could bump the minimum version...

> Is there a compelling reason for catching and rethrowing?

I'm having trouble thinking of one.

Note: in addition to tackling this here for pandas-gbq, we should update the defaults in pydata-google-auth for folks using that library directly. https://github.com/pydata/pydata-google-auth/issues/63
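In the meantime, callers can be explicit rather than relying on the library defaults. A minimal sketch, assuming `use_local_webserver` is the default in question (see the linked issue for the actual discussion):

```python
import pydata_google_auth

credentials = pydata_google_auth.get_user_credentials(
    scopes=["https://www.googleapis.com/auth/bigquery"],
    # Passing the flag explicitly avoids depending on the library default.
    use_local_webserver=True,
)
```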

Would love a contribution. For clustering columns in particular, let's mimic the interface that bigframes.pandas.DataFrame.to_gbq came up with:

```python
clustering_columns: typing.Union[
    pandas.core.indexes.base.Index, typing.Iterable[typing.Hashable]
] = (),
```

> clustering_columns Union[pd.Index,...
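A minimal sketch of how pandas-gbq could normalize that argument before building the clustering configuration; the helper name is hypothetical:

```python
import typing

import pandas


def _normalize_clustering_columns(
    clustering_columns: typing.Union[
        pandas.core.indexes.base.Index, typing.Iterable[typing.Hashable]
    ] = (),
) -> typing.List[str]:
    """Coerce an Index or iterable of labels into a list of column names."""
    columns = [str(column) for column in clustering_columns]
    # BigQuery supports at most four clustering columns.
    if len(columns) > 4:
        raise ValueError("At most 4 clustering columns are allowed.")
    return columns


# Both spellings from the proposed annotation are accepted:
_normalize_clustering_columns(pandas.Index(["country", "city"]))
_normalize_clustering_columns(["country", "city"])
```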

Per https://github.com/googleapis/google-cloud-python/issues/14482, we should only do this if `dtype_backend="numpy_nullable"`.
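Roughly, the gating would look like the sketch below; the function name and the exact mapping are placeholders, the point is only that nullable dtypes are opted into via `dtype_backend="numpy_nullable"`:

```python
import typing


def _types_mapper(dtype_backend: str = "numpy") -> typing.Optional[dict]:
    """Placeholder for the dtype-selection logic discussed above."""
    if dtype_backend == "numpy_nullable":
        # Only use pandas' nullable extension dtypes when explicitly requested.
        return {"INTEGER": "Int64", "BOOLEAN": "boolean", "FLOAT": "Float64"}
    # Default: keep the existing NumPy-backed behavior unchanged.
    return None
```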

Nothing blocking it, now that the API is GA. We probably want to use the `create_bqstorage_client` parameter in the [to_dataframe](https://googleapis.dev/python/bigquery/latest/generated/google.cloud.bigquery.job.QueryJob.html#google.cloud.bigquery.job.QueryJob.to_dataframe) method for future compatibility.
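For example (the query is arbitrary; this assumes the `google-cloud-bigquery-storage` extra is installed so the faster download path can be used):

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes default credentials and project
query_job = client.query(
    "SELECT name, SUM(number) AS total "
    "FROM `bigquery-public-data.usa_names.usa_1910_2013` "
    "GROUP BY name ORDER BY total DESC LIMIT 10"
)

# create_bqstorage_client lets to_dataframe construct a BigQuery Storage API
# client on demand, which downloads large result sets much faster than the
# REST API.
df = query_job.to_dataframe(create_bqstorage_client=True)
print(df.head())
```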