
Add support for writetimeout

Open samsonov90 opened this issue 8 years ago • 3 comments

Added support for specifying a timeout between writing the temporary files to S3 and asking Redshift to COPY them. This works around the failure caused by S3's eventual consistency (`S3ServiceException: The specified key does not exist., Status 404, Error NoSuchKey`) when writing a DataFrame to Redshift. The issue is described in more detail here: https://github.com/databricks/spark-redshift/issues/136
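For illustration, a minimal sketch of how such an option might be used from a Spark job, assuming the PR exposes it as a write option named `writetimeout` (the option name, units, and value here are assumptions, not confirmed API; the other options are standard spark-redshift connector options):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

object WriteTimeoutExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("writetimeout-example").getOrCreate()
    import spark.implicits._

    // Small example DataFrame to write to Redshift.
    val df = Seq((1, "a"), (2, "b")).toDF("id", "value")

    df.write
      .format("com.databricks.spark.redshift")
      .option("url", "jdbc:redshift://example-cluster:5439/db?user=user&password=pass")
      .option("dbtable", "example_table")
      .option("tempdir", "s3a://example-bucket/tmp/")
      .option("forward_spark_s3_credentials", "true")
      // Hypothetical option from this PR: wait (e.g. in seconds) after the
      // temp files land in S3 before issuing the Redshift COPY, to ride out
      // S3's eventual consistency.
      .option("writetimeout", "30")
      .mode(SaveMode.Append)
      .save()
  }
}
```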

samsonov90 avatar Dec 01 '17 21:12 samsonov90

Any update on this? When will this fix be merged into the databricks:master branch?

sylvinho81 avatar Sep 07 '18 15:09 sylvinho81

I am also wondering whether it will be merged, because this issue comes up quite often in Amazon Glue.

sphinks avatar Jan 14 '19 13:01 sphinks

@JoshRosen please take a look.

samsonov90 avatar Jan 14 '19 14:01 samsonov90