
How to make data transfer faster while using databricks-sql-go

Open calebeaires opened this issue 1 year ago • 4 comments

I am using the driver to run a data migration. When dealing with a table of 24 million rows and 9 columns, the performance is excellent when I fetch 10 thousand rows. When I increase the fetch size to 100 thousand rows, the transfer speed is still good. However, when fetching 1 million rows or more, the data transfer becomes very slow. A quick test made me think that the driver tries to fetch all the data in a single batch. Is there a way to improve this process?
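One way to keep per-query memory bounded when migrating a table this large is to split the copy into explicit key ranges and run one bounded query per range, instead of a single huge SELECT. The sketch below is only an illustration of that idea; the `batchRanges` helper and the range math are my own, not part of databricks-sql-go:

```go
package main

import "fmt"

// rng is a half-open [start, end) slice of the table's numeric key space.
type rng struct {
	start, end int64
}

// batchRanges splits [0, total) into chunks of at most size rows, so each
// chunk can be fetched with its own bounded query
// (e.g. WHERE id >= start AND id < end) rather than one giant result set.
func batchRanges(total, size int64) []rng {
	var out []rng
	for start := int64(0); start < total; start += size {
		end := start + size
		if end > total {
			end = total
		}
		out = append(out, rng{start, end})
	}
	return out
}

func main() {
	// 24 million rows in 100k-row chunks -> 240 bounded queries.
	ranges := batchRanges(24_000_000, 100_000)
	fmt.Println(len(ranges), ranges[0], ranges[len(ranges)-1])
	// → 240 {0 100000} {23900000 24000000}
}
```

This assumes the table has a monotonically increasing numeric key to range over; with such a key, each query's result stays small regardless of the total table size.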

This is the connection string:


"token:xxx@host:443$xxx-path?catalog=sample&database=big_table&useCloudFetch=true&maxRows=10000"

I've tried these two settings, but data transfer is still slow for big tables.

  • useCloudFetch=true
  • maxRows=10000

In forums, users suggest changing spark.driver.maxResultSize?

calebeaires avatar Jul 11 '24 20:07 calebeaires

@calebeaires considering you enabled CloudFetch, your assumption is most likely correct: the driver actually tries to download all the data into memory. That's what we discovered recently, and we are trying to fix it right now - databricks/databricks-sql-go#234. Try disabling CloudFetch; this helped some other users. Then you can play with maxRows to see if it has any effect in your case.
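For reference, disabling CloudFetch is just a DSN flag change. Reusing the DSN from the original report (placeholders kept as posted), that would look like:

```
token:xxx@host:443$xxx-path?catalog=sample&database=big_table&useCloudFetch=false&maxRows=10000
```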

kravets-levko avatar Jul 12 '24 07:07 kravets-levko

@kravets-levko I've changed the settings to useCloudFetch=false, but I'm still getting errors. Sometimes this error comes up:


execution error: failed to execute query: unexpected operation state ERROR_STATE: Total size of serialized results of 382 tasks (4.0 GiB) is bigger than spark.driver.maxResultSize 4.0 GiB. Code: 108

Topic: Spark driver maxResultSize exceeds 4.0 GiB

Hope we find a solution!

calebeaires avatar Jul 16 '24 18:07 calebeaires

@calebeaires for this particular error, please reach out to an administrator of your workspace, or to Databricks support. This one is related to your workspace configuration, and that's not something that can be handled in the library.

kravets-levko avatar Jul 16 '24 19:07 kravets-levko

Thank you so much. Please help me explain the issue to the administrator. Is there some config the admin can set concerning the max memory result size?
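For context (this is my own note, not an answer given in the thread): spark.driver.maxResultSize is a cluster-level Spark property that a workspace admin can raise in the cluster's Spark configuration, for example:

```
spark.driver.maxResultSize 8g
```

The value 8g here is just an illustrative choice; setting it to 0 removes the limit entirely, at the risk of driver out-of-memory errors.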

calebeaires avatar Jul 18 '24 20:07 calebeaires