Hal Ali
@smomni yes, I noticed that as well & had made a note of it at the end of my pull request's message: >limitation with athena execution engine version 1 information.columns...
This would be nice. Let me know if I can help.
Another option would be to specify batch_size as `auto`, where the client can construct the biggest batch with the constraints that: * it does not exceed _maximum rows_ (I think...
@jon-wobken I have a [working solution for the `auto` approach](https://github.com/haleemur/simple-salesforce/tree/feat/bulk-auto-batch-size) which I described above. However, there are two things I noticed after reading the [Salesforce Bulk API limits documentation](https://developer.salesforce.com/docs/atlas.en-us.api_asynch.meta/api_asynch/bulk_common_limits.htm): 1. the...
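To make the `auto` idea concrete, here is a rough sketch of the batching logic I'm describing; the function name and the limit constants below are placeholders, not the values or names used on the linked branch:

```python
import json

# Placeholder limits -- the real per-batch caps are listed in the Bulk API
# limits documentation and may differ from these numbers.
MAX_ROWS_PER_BATCH = 10_000
MAX_BYTES_PER_BATCH = 10_000_000


def auto_batches(records):
    """Yield the largest batches that stay under both the row and size limits."""
    batch, batch_bytes = [], 0
    for record in records:
        record_bytes = len(json.dumps(record).encode("utf-8"))
        # Close the current batch if adding this record would break either constraint.
        if batch and (
            len(batch) >= MAX_ROWS_PER_BATCH
            or batch_bytes + record_bytes > MAX_BYTES_PER_BATCH
        ):
            yield batch
            batch, batch_bytes = [], 0
        batch.append(record)
        batch_bytes += record_bytes
    if batch:
        yield batch
```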
@thomasgravyty Looking at the repo HEAD, I feel like the bug has been resolved. Should we close this issue & #564?
Thanks @edgarrmondragon, I have a slight preference towards option 1 as it's the simplest code change and will suffice. If the documentation is improved according to #3116, maybe we should...
On further thought, maybe meltano should test whether `namespace` is present in the scenarios where it needs to be present (i.e. the plugin is a custom plugin), and emit specific instructions to the...
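Something along these lines is what I'm picturing; the helper name and the way it decides a plugin is "custom" are purely illustrative here, not Meltano's actual internals:

```python
def check_custom_plugin_namespace(plugin_def: dict) -> None:
    """Illustrative check only -- not Meltano's real logic.

    A plugin defined directly in meltano.yml (not inheriting from another
    plugin, but declaring its own pip_url) is treated as "custom" here,
    and custom plugins must declare a `namespace`.
    """
    is_custom = (
        plugin_def.get("inherit_from") is None
        and plugin_def.get("pip_url") is not None
    )
    if is_custom and not plugin_def.get("namespace"):
        raise ValueError(
            f"Custom plugin '{plugin_def.get('name')}' has no `namespace`. "
            "Add a `namespace:` key (snake_case) to its definition in meltano.yml."
        )
```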
Hmm, maybe I misunderstood the flow of data as well, & as a result my idea is not very clearly expressed. Please let me know if I'm incorrect in the...
@edgarrmondragon thank you for writing the proposal. I'm curious to hear how you feel about this alternative:

```yaml
stream_maps:
  # Apply these transforms to the stream called 'customers'
  customers:
    email: ...
```