fluent-plugin-bigquery
## Environments

- fluentd version: fluent/fluentd:v1.1.0
- plugin version: fluent-plugin-bigquery-1.2.0

## Expected Behavior

The dataset is created in `asia-northeast1`. I cannot find a way to specify the `location`, so I...
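For reference, the BigQuery REST API does let you pin a dataset to a region at creation time via the `location` field of the dataset resource passed to `datasets.insert`. A minimal sketch of that request body (the project and dataset IDs are placeholders; this is not the plugin's own code):

```ruby
require "json"

# Build the body for a BigQuery datasets.insert call; the "location"
# field pins the dataset to a region, e.g. asia-northeast1.
def dataset_insert_body(project_id, dataset_id, location)
  {
    "datasetReference" => {
      "projectId" => project_id,
      "datasetId" => dataset_id
    },
    "location" => location
  }
end

body = dataset_insert_body("my-project", "my_dataset", "asia-northeast1")
puts JSON.pretty_generate(body)
```

A plugin-level `location` option would presumably just thread this value through to the dataset/table creation call.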
Periodically I catch the following error:

```
2017-07-29 11:46:43 +0300 [warn]: #0 plugin/output.rb:1096:block in update_retry_state: failed to flush the buffer. retry_time=0 next_retry_seconds=2017-07-29 11:46:43 +0300 chunk="55570d4efaf3ca91e7b235dffef577db" error_class=MultiJson::ParseError error="lexical error: invalid bytes...
```
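A "lexical error: invalid bytes" from MultiJson usually means a record in the buffer contains byte sequences that are not valid UTF-8. A hedged sketch of one way to make such strings serializable before they reach the JSON encoder (this is general Ruby, not the plugin's own handling):

```ruby
require "json"

# A string carrying a byte that is not valid UTF-8, as might arrive
# from a log line with a corrupted character.
raw = "valid \xFF invalid"
raw.valid_encoding?   # => false

# String#scrub replaces undecodable bytes with a replacement string,
# making the result safe for JSON serialization.
clean = raw.scrub("?")
clean.valid_encoding? # => true

puts JSON.generate("message" => clean)
```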
## Environments

- fluentd version: 14
- plugin version: 1.0.0

I'm considering making `wait_interval` configurable in the version of fluent-plugin-bigquery that I am using. The current wait time for jobs...
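As a sketch of the idea (method and parameter names here are hypothetical, not the plugin's actual internals), a configurable wait interval for load-job polling might look like:

```ruby
# Hypothetical polling loop where the sleep between job-status checks
# is a parameter instead of a hard-coded constant.
def wait_for_job(interval: 5, max_retries: 60)
  max_retries.times do |attempt|
    status = yield(attempt)   # caller supplies the job-status check
    return status if status == :done
    sleep interval
  end
  raise "job did not finish within #{interval * max_retries}s"
end

# Example: pretend the job reports :done on the third check.
wait_for_job(interval: 0) { |n| n >= 2 ? :done : :running }
```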
## Environments

- fluentd version: 0.14.21
- plugin version: 1.0.0

I'm working on refactoring fluentd configurations at work, and I thought it would be nice if placeholders could be used...
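For context, fluentd v0.14+ can expand placeholders such as `${tag}` and strftime patterns like `%Y%m%d` in output parameters when the matching chunk keys are set in the `<buffer>` section. A hedged sketch of what that could look like for this plugin (the `table` value and tag pattern are illustrative, and auth options are omitted):

```
<match access.**>
  @type bigquery_insert
  # auth/project/dataset options omitted
  table access_${tag}_%Y%m%d
  <buffer tag, time>
    timekey 1d
  </buffer>
</match>
```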
This seems to be a general Ruby problem on Windows with certificate validation; I'm just wondering whether there is an easy way to turn off verification. After looking around, it seems that it is...
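For what it's worth, at the OpenSSL level Ruby can be told to skip peer verification, though the usual fix on Windows is to point Ruby at a CA bundle instead. A sketch (plain Ruby stdlib, not a plugin option):

```ruby
require "openssl"

# Disabling verification entirely -- only reasonable for local debugging,
# since it removes protection against man-in-the-middle attacks.
ctx = OpenSSL::SSL::SSLContext.new
ctx.verify_mode = OpenSSL::SSL::VERIFY_NONE

# The safer fix on Windows is usually to supply a CA bundle, e.g.:
#   set SSL_CERT_FILE=C:\path\to\cacert.pem
```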
BigQuery can dynamically add columns; would it be a good idea to add this handling to the plugin? It would avoid checking the table schema every time, and could be added in the error handling. code...
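A hedged sketch of the idea (the helper name is hypothetical): on an unknown-field error, diff the record's keys against the cached schema and append only the missing columns, rather than re-fetching the schema on every write:

```ruby
# Given the cached schema field names and an incoming record, compute
# the columns that would need to be appended to the table schema.
def missing_columns(schema_fields, record)
  record.keys.reject { |k| schema_fields.include?(k) }
end

schema = %w[time host message]
record = { "time" => 0, "host" => "web1", "status" => 200 }
missing_columns(schema, record)  # => ["status"]
```

The result would then feed a schema-update call (e.g. BigQuery's `tables.patch`), which only ever needs to run on the error path.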
We love the plugin, and we primarily use it for event shipping. We use the `ignore_unknown_values` config option, but we would like to convert all ignored values to a JSON string...
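A hedged sketch of what that conversion could look like (the helper and the `extra` column name are hypothetical, not plugin options): split each record into fields the table schema knows about and the rest, then serialize the leftovers into a single JSON string column instead of dropping them:

```ruby
require "json"

# Keep known fields as-is; pack everything else into one JSON string
# stored under the `into` column, so no data is silently discarded.
def pack_unknown_fields(record, known_fields, into: "extra")
  known, unknown = record.partition { |k, _| known_fields.include?(k) }
  row = known.to_h
  row[into] = JSON.generate(unknown.to_h) unless unknown.empty?
  row
end

pack_unknown_fields({ "host" => "web1", "debug_id" => 42 }, %w[host])
# => { "host" => "web1", "extra" => "{\"debug_id\":42}" }
```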
## Environments

- fluentd version: v1.15.0-1.0
- plugin version: 3.0.0

## Configuration

```
@type bigquery_load
path /mnt/audit-logs-bigquery.*.buffer
flush_interval 10s
total_limit_size 1g
auth_method json_key
json_key "#{ENV['FLUENT_BIGQUERY_JSON_KEY']}"
project "#{ENV['FLUENT_BIGQUERY_PROJECT']}"
dataset "#{ENV['FLUENT_BIGQUERY_DATASET']}"
table...
```
Also:

* fix a few typos and some wording issues,
* replace links to a deprecated link shortener with full ones.
Creating a new table in BigQuery each time is currently not good practice; it is better to create a partitioned table. Therefore, the value of creating tables...
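For reference, the BigQuery `tables.insert` API accepts a `timePartitioning` section in the table resource, so a single day-partitioned table can replace per-period table creation. A minimal sketch of that request body (IDs and the partitioning field are placeholders):

```ruby
require "json"

# Body for a BigQuery tables.insert call creating a table partitioned
# by day on the given timestamp field, created once rather than per day.
def partitioned_table_body(project_id, dataset_id, table_id, field: "time")
  {
    "tableReference" => {
      "projectId" => project_id,
      "datasetId" => dataset_id,
      "tableId" => table_id
    },
    "timePartitioning" => { "type" => "DAY", "field" => field }
  }
end

puts JSON.pretty_generate(partitioned_table_body("my-project", "logs", "events"))
```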