
aws_s3 postgres extension to import/export data from/to s3 (compatible with aws_s3 extension on AWS RDS)

17 postgres-aws-s3 issues

I am trying to import s3 data from localstack, using:

```sql
select aws_s3.table_import_from_s3(
    'tablename',
    'col1,col2',
    '(format csv, header true)',
    aws_commons.create_s3_uri('my-bucket', 'test.csv', 'us-west-2'),
    aws_commons.create_aws_credentials('none', 'none', ''),
    'http://localstack:4566'
);
```

`none` is what I use for all...

Some tools do not properly set the ContentEncoding metadata when writing to S3. This update adds the ability to specify a custom encoding (e.g. "gzip") as input to the `table_import_from_s3` function.
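Such a call might look like the sketch below; the named parameter `content_encoding` is an assumption based on the description above, so check the function's actual signature before relying on it.

```sql
-- Hypothetical: "content_encoding" is an assumed parameter name for the
-- custom-encoding feature described above; verify against the extension.
SELECT aws_s3.table_import_from_s3(
    'my_table',                       -- target table (placeholder)
    '',                               -- empty column list = all columns
    '(format csv, header true)',      -- COPY options
    aws_commons.create_s3_uri('my-bucket', 'data.csv.gz', 'us-east-1'),
    aws_commons.create_aws_credentials('access_key', 'secret_key', ''),
    content_encoding := 'gzip'        -- treat the object as gzip-compressed
);
```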

On AWS, the return type of `table_import_from_s3` is `text`, while the version here returns an `int`.
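One way to see which signature a given database has is to ask the catalog for the declared return type:

```sql
-- Show the declared return type(s) of table_import_from_s3 in this database;
-- AWS RDS declares text, this extension reportedly declares int.
SELECT n.nspname AS schema, p.proname, pg_get_function_result(p.oid) AS return_type
FROM pg_proc p
JOIN pg_namespace n ON n.oid = p.pronamespace
WHERE p.proname = 'table_import_from_s3';
```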

Hi, I have a question. It could be something I missed, but I would like to use the postgres **copy command** to export/import data to RDS from outside RDS, so the aws-s3 extension...
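For context, the extension's server-side import is the usual stand-in for a client-side `COPY ... FROM`; a minimal sketch, with table, bucket, key, region, and credentials as placeholder values:

```sql
-- Roughly equivalent to COPY my_table FROM '...' (format csv), except the
-- Postgres server fetches the file from S3 itself; all names are placeholders.
SELECT aws_s3.table_import_from_s3(
    'my_table',
    '',                              -- empty column list = all columns
    '(format csv)',
    aws_commons.create_s3_uri('my-bucket', 'exports/data.csv', 'us-east-1'),
    aws_commons.create_aws_credentials('access_key', 'secret_key', '')
);
```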

![Screen Shot 2021-07-12 at 10 33 00 AM](https://user-images.githubusercontent.com/1624530/126048282-c0920bc6-fb8f-4c5e-ba7d-2de4ee7073cf.png) Some tools, such as the PySpark DataFrame S3 writer, do not set AWS system-defined metadata when writing with "gzip" compression specified. The Postgres s3_import extension...

Is there any example of using this in unit tests with a moto mock_s3 context or a moto server endpoint?
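Since an in-process `mock_s3` decorator presumably cannot intercept requests made by the Postgres server process, the standalone moto server combined with the extension's endpoint-override argument (as in the localstack example above) seems like the workable variant; a sketch, with moto's default port 5000 and all names as placeholders:

```sql
-- Point the endpoint-override argument at a standalone moto server
-- (started e.g. with `moto_server -p 5000`); names are placeholders.
SELECT aws_s3.table_import_from_s3(
    'my_table',
    'col1,col2',
    '(format csv, header true)',
    aws_commons.create_s3_uri('test-bucket', 'fixture.csv', 'us-east-1'),
    aws_commons.create_aws_credentials('testing', 'testing', ''),
    'http://localhost:5000'
);
```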

Hello! I'm just trying to use your aws_s3 extension. When running this query: SELECT * FROM aws_s3.query_export_to_s3( query => ' select json_agg(post_main_approved) from asmo.post_main_approved limit 10 ; ', bucket =>...
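As a reference point, a minimal complete `query_export_to_s3` call in the shape the RDS docs use is sketched below; bucket, key, and region are placeholders. Note that the `LIMIT` belongs in a subquery here, since `json_agg` collapses the result to a single row and a `LIMIT` applied after the aggregation is a no-op.

```sql
-- Placeholder bucket/key/region; the embedded query keeps the json_agg over
-- asmo.post_main_approved from the snippet above, with LIMIT moved inside.
SELECT *
FROM aws_s3.query_export_to_s3(
    'SELECT json_agg(p) FROM (SELECT * FROM asmo.post_main_approved LIMIT 10) p',
    aws_commons.create_s3_uri('my-bucket', 'exports/posts.json', 'us-east-1')
);
```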