spark-redshift
Add support for writetimeout
Added support for specifying a timeout between writing the temp files to S3 and issuing the Redshift COPY command. This works around failures caused by S3's eventual consistency (`S3ServiceException: The specified key does not exist., Status 404, Error NoSuchKey`) when writing a DataFrame to Redshift. The issue is described in more detail here: https://github.com/databricks/spark-redshift/issues/136
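The idea behind the patch is to wait a bounded amount of time for the newly written S3 keys to become visible before triggering COPY. A minimal sketch of such a bounded-visibility poll (all names here are hypothetical, not the PR's actual API) could look like:

```scala
import scala.annotation.tailrec

// Hypothetical helper: repeatedly invoke a visibility check (e.g. an S3
// HEAD request on the temp-file key) until it succeeds or `timeoutMs`
// elapses, sleeping `pollIntervalMs` between attempts. Returns whether
// the key became visible within the timeout.
def waitUntilVisible(isVisible: () => Boolean,
                     timeoutMs: Long,
                     pollIntervalMs: Long = 500L): Boolean = {
  val deadline = System.currentTimeMillis() + timeoutMs

  @tailrec
  def loop(): Boolean = {
    if (isVisible()) true                                // key is visible: COPY can proceed
    else if (System.currentTimeMillis() >= deadline) false // give up after the timeout
    else { Thread.sleep(pollIntervalMs); loop() }          // wait and retry
  }

  loop()
}
```

The caller would run this check for each temp file before issuing COPY, failing the write (rather than hitting `NoSuchKey` inside Redshift) if the timeout expires.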
Any update on this? When will this fix be merged into databricks:master?
I'm also wondering whether it will be merged, because this issue comes up quite often in AWS Glue.
@JoshRosen please take a look.