Tim Sweña (Swast)


e2e failures appear unrelated:

```
FAILED tests/system/large/functions/test_remote_function.py::test_remote_function_max_instances[set-None]
FAILED tests/system/large/functions/test_remote_function.py::test_remote_function_max_instances[no-set]
FAILED tests/system/large/streaming/test_bigtable.py::test_streaming_df_to_bigtable
FAILED tests/system/large/streaming/test_pubsub.py::test_streaming_df_to_pubsub
```

Note: Update the integrations notebook (see: https://github.com/googleapis/python-bigquery-dataframes/pull/835) once this issue is resolved.

Note: this differs from https://github.com/googleapis/python-bigquery-dataframes/pull/117 in that it would be possible to set the user-agent temporarily, such as when a library accepts a bigframes object but doesn't initialize the bigframes...
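A minimal sketch of what "temporarily" could look like, assuming a settable `application_name`-style option; the attribute name here is hypothetical, not the actual bigframes API:

```python
import contextlib


@contextlib.contextmanager
def temporary_user_agent(options, agent):
    # Swap in a partner user-agent string, restoring the previous
    # value even if the wrapped code raises.
    previous = options.application_name  # hypothetical option name
    options.application_name = agent
    try:
        yield
    finally:
        options.application_name = previous
```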

For going from BigQuery DataFrames to polars, I'm adding a `to_arrow` method in https://github.com/googleapis/python-bigquery-dataframes/pull/807, as well as an example of how to create a polars DataFrame from the results.
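Roughly the shape of that conversion, as a sketch: `to_arrow` is the method from the PR (assumed to return a `pyarrow.Table`), and `polars.from_arrow` is an existing polars entry point:

```python
import bigframes.pandas as bpd
import polars as pl

bf_df = bpd.read_gbq("bigquery-public-data.usa_names.usa_1910_2013")
arrow_table = bf_df.to_arrow()  # added in PR #807; downloads the results
pl_df = pl.from_arrow(arrow_table)
```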

For uploading to BigQuery, I have updated the polars docs to indicate how to get BigQuery to correctly handle list types (https://github.com/pola-rs/polars/pull/20292). I think that `read_polars` and `to_polars` methods would...
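For reference, the load path that docs change describes looks roughly like this; a sketch, with a placeholder destination table. `enable_list_inference` is the BigQuery Parquet option that makes list columns load as ARRAY instead of a struct with a repeated field:

```python
import io

import polars as pl
from google.cloud import bigquery

client = bigquery.Client()
df = pl.DataFrame({"key": [1, 2], "values": [[1, 2], [3]]})  # list-typed column

# Tell BigQuery to unwrap the Parquet list encoding.
parquet_options = bigquery.format_options.ParquetOptions()
parquet_options.enable_list_inference = True
job_config = bigquery.LoadJobConfig()
job_config.source_format = bigquery.SourceFormat.PARQUET
job_config.parquet_options = parquet_options

with io.BytesIO() as stream:
    df.write_parquet(stream)
    stream.seek(0)
    load_job = client.load_table_from_file(
        stream,
        "my-project.my_dataset.my_table",  # placeholder destination
        job_config=job_config,
    )
    load_job.result()  # waits for the load to finish
```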

I just mailed https://github.com/googleapis/python-bigquery-dataframes/pull/1855 with `bpd.read_arrow(pyarrow.Table)` to round out the other side of this conversion. Technically I think this was possible before by going through the DataFrame constructor, but that...
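So the polars-to-bigframes direction becomes (a sketch assuming the `read_arrow` signature from the PR; `DataFrame.to_arrow` is an existing polars method):

```python
import bigframes.pandas as bpd
import polars as pl

pl_df = pl.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})
bf_df = bpd.read_arrow(pl_df.to_arrow())  # read_arrow added in PR #1855
```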

Failures look like real failures.

```
    if not 200 <= response.status_code < 300:
        raise exceptions.from_http_response(response)
E   google.api_core.exceptions.BadRequest: 400 GET https://bigquery.googleapis.com/bigquery/v2/projects/python-docs-samples-tests/queries/32f42306-e95f-48bc-a2fb-56761aec5476?maxResults=0&location=US&prettyPrint=false: Invalid field name "_TABLE_SUFFIX". Field names are not allowed to start with the (case-insensitive)...
```

Added `do not merge`. Need to make sure this is compatible with `to_gbq()` and `cached()`.

From the notebook tests:

```
File /tmpfs/src/github/python-bigquery-dataframes/bigframes/core/nodes.py:711, in GbqTable.from_table(table, columns)
    708 @staticmethod
    709 def from_table(table: bq.Table, columns: Sequence[str] = ()) -> GbqTable:
    710     # Subsetting fields with columns can reduce...
```

Tested the failing samples tests locally. I think my latest commits solve the issue of not being able to materialize results with `_TABLE_SUFFIX` as a column name.
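For context, a sketch of the kind of query that hits this (the wildcard table below is just an illustrative public dataset): `_TABLE_SUFFIX` is a pseudo-column on wildcard-table queries, and BigQuery rejects destination field names starting with `_TABLE_`, which is what the 400 "Invalid field name" error above is complaining about.

```python
import bigframes.pandas as bpd

# Selecting _TABLE_SUFFIX without an alias yields a result column whose name
# can't be written to a destination table when bigframes materializes it.
df = bpd.read_gbq(
    """
    SELECT _TABLE_SUFFIX, stn, temp
    FROM `bigquery-public-data.noaa_gsod.gsod*`
    WHERE _TABLE_SUFFIX = '2020'
    """
)
df.to_pandas()  # forces materialization of the query results
```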