Yang Xiufeng
@BohuTANG I tried, but without a file to infer from, there is no schema for the output. I will try again to find a workaround.
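For context, a hedged sketch of the inference step in question, assuming Databend's `infer_schema` table function with a hypothetical stage and file name; with no file at the location, there is nothing to derive an output schema from:
```
-- infer_schema reads an existing file; an empty location yields no schema
SELECT * FROM infer_schema(location => '@my_stage/sample.parquet');
```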
my new sf account cannot start a warehouse, I will try again later. Some problems still need resolving:
- In principle, aside from `select *`, even a simple query like `select a`...
`max_file_size` is not guaranteed, and the same is true for [Snowflake](https://docs.snowflake.com/en/sql-reference/sql/copy-into-location):
1. We need parallel processing to speed things up, and we need to avoid creating files that are too small due to parallel processing....
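A minimal sketch of the option in question, assuming a hypothetical stage `@my_stage` and table `my_table`; in both Databend and Snowflake, `MAX_FILE_SIZE` on `COPY INTO <location>` is a best-effort target rather than a hard limit:
```
-- MAX_FILE_SIZE is a hint: parallel writers may still produce
-- files smaller or slightly larger than the target.
COPY INTO @my_stage/unload/
FROM my_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
MAX_FILE_SIZE = 16777216; -- target ~16 MB per output file
```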
@soyeric128 no need for that, I think. In practice, it's not cost-effective to have files that are too small; no one would really do this except for testing. set a...
fixed by https://github.com/datafuselabs/databend/pull/15596
done https://github.com/datafuselabs/databend/pull/14420
use ` (type = TSV compression=gzip)` like in copy:
```
curl -XPUT 'http://root:@127.0.0.1:8000/v1/streaming_load' \
  -H 'insert_sql: insert into hackernews_1m FILE_FORMAT = (type = TSV compression=gzip)' \
  -F 'upload=@"./hacknernews_1m.csv.gz"'
{"id":"ed17ccdc-2ea2-4fac-9ff9-97b0f61fd487","state":"SUCCESS","stats":{"rows":1000000,"bytes":130618406},"error":null,"files":["hacknernews_1m.csv.gz"]}
```
the default compression is `none`, not `auto`
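A hedged sketch of the same point for `COPY INTO`, assuming a hypothetical stage `@my_stage`: since the default is `none`, gzip-compressed files must declare their compression explicitly:
```
-- compression defaults to NONE, so .gz input must set it explicitly
COPY INTO hackernews_1m
FROM @my_stage/hacknernews_1m.csv.gz
FILE_FORMAT = (TYPE = TSV COMPRESSION = GZIP);
```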
the stats are right on my mac. the `bytes` you get is different too; maybe you have a different hacknernews_1m.csv?
```
(venv) ➜  test git:(stage2) ls -lh hacknernews_1m.csv*
-rw-r--r--  1...
```
- old server: the user gets an error when executing `Begin`
- old client + new server: `Begin` / `commit` / `rollback` may succeed, but the txn is not actually working (see the sketch below)
- client may need...
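A minimal sketch of the statements involved, assuming a hypothetical table `t`; on an old-client/new-server mix these may all appear to succeed even though no transaction is actually in effect:
```
BEGIN;                    -- old server: errors out here instead
INSERT INTO t VALUES (1);
ROLLBACK;                 -- old client + new server: may "succeed",
                          -- but the insert is not actually rolled back
```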