Frank Taylor
+1 on getting this merged
Hey @FStephenQuaratiello, would you mind providing the GraphQL query and the BigQuery SQL query that you ran so we can investigate further?
@gtseres-workable thanks for this suggestion. If this can meaningfully reduce costs, it should be implemented, no question. Can you provide an estimate of the cost difference? I think the cost...
The major challenge here is the memory and time limits for Cloud Functions. To push every log into Stackdriver, this function would need to decompress the batches of log files...
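For illustration, here's a minimal sketch of what the decompression step might look like, assuming the batches are gzipped NDJSON files; the function name, field names, and sample data below are hypothetical, not taken from the repo:

```python
import gzip
import io
import json

def iter_log_entries(blob_bytes):
    """Stream-decompress a gzipped NDJSON batch, yielding one parsed log
    entry at a time so the whole batch never has to sit in memory."""
    with gzip.open(io.BytesIO(blob_bytes), mode="rt", encoding="utf-8") as fh:
        for line in fh:
            line = line.strip()
            if line:
                yield json.loads(line)

# Hypothetical usage with an in-memory example batch:
batch = gzip.compress(b'{"ClientIP": "203.0.113.7"}\n{"ClientIP": "198.51.100.2"}\n')
entries = list(iter_log_entries(batch))
```

Even with streaming like this, every entry still has to pass through the function, so the Cloud Functions execution-time limit remains the binding constraint for large batches.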
Hey @thecodeassassin - thanks for putting this in :) The default timestamp format for Logpush is actually RFC3339 (e.g. `2019-10-12T07:20:50.52Z`):  ... which is of type `TIMESTAMP`...
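For reference, an RFC3339 timestamp like the example above can be parsed with Python's standard library; this is just a quick sketch, not code from the repo:

```python
from datetime import datetime

# Parse the RFC3339 example timestamp; in strptime, %f accepts 1-6
# fractional-second digits and %z accepts a literal "Z" (UTC) suffix
# on Python 3.7+.
ts = datetime.strptime("2019-10-12T07:20:50.52Z", "%Y-%m-%dT%H:%M:%S.%f%z")
```

BigQuery's `TIMESTAMP` type accepts this format directly on load, which is why the default Logpush format maps cleanly onto it.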
Thanks @Anticmos. Can you please create a default string value for the `CRON_JOB_NAME` in `deploy.sh`, for example:

```bash
# deploy.sh
CRON_JOB_NAME='cf_logs_cron'
```
Yes -- I'm still here :) Was on leave for a while. Will merge these changes and others by the end of this week!
Hey @igorwwwwwwwwwwwwwwwwwwww thanks for taking a stab at this. In the past, creating ingestion-time partitioning for logs inserted via a load job has been non-trivial. The solution in your PR...
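For context, a rough sketch of what ingestion-time partitioning on a load job involves (the project, dataset, and table names below are hypothetical): the destination table's `timePartitioning` is set with no `field`, so rows partition on `_PARTITIONTIME`, and loading into a specific day requires the `$YYYYMMDD` partition decorator on the table ID:

```json
{
  "configuration": {
    "load": {
      "destinationTable": {
        "projectId": "my-project",
        "datasetId": "cloudflare_logs",
        "tableId": "logs$20191012"
      },
      "timePartitioning": { "type": "DAY" },
      "sourceFormat": "NEWLINE_DELIMITED_JSON"
    }
  }
}
```

Routing each batch to the correct day's decorator is the part that has historically made this non-trivial for the function.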
Hey @gpacuilla, thanks for the contribution! Can you please remove all of the semicolons added by your linter and make a new commit?
On initial review, the PR you've provided seems too specific to your use case and adds complexity for the majority of deployments. That said, reading your PR comments,...