Yannick Misteli
Just wanted to plus-one here, since my team is leveraging Argo but other teams in the company are using Airflow. Having a common framework would be really great...
@thesuperzapper do you have a working version of an RStudio Dockerfile for Kubeflow that you can share?
Okay, great. To give a bit more context: the ultimate goal would be to do some pre-processing in AWS Lambda (S3-triggered)... hence it's crucial to only load a certain level...
The S3 API supports the HTTP `Range:` header (see [RFC 2616](https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35)), which takes a byte-range argument. Sample S3 call: `aws s3api get-object --bucket my_bucket --key path/to/my/file/file1.gz file1.gz --range bytes=1000-2000`
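The same ranged read can be done from Python, which is handy inside a Lambda handler. A minimal sketch, assuming boto3 is available and the bucket/key names are placeholders (the boto3 call is commented out since it needs real AWS credentials):

```python
def byte_range(start: int, end: int) -> str:
    """Format an HTTP Range header value for bytes start..end (inclusive)."""
    return f"bytes={start}-{end}"

# Hypothetical usage with boto3 against a real bucket:
# import boto3
# s3 = boto3.client("s3")
# resp = s3.get_object(
#     Bucket="my_bucket",
#     Key="path/to/my/file/file1.gz",
#     Range=byte_range(1000, 2000),
# )
# chunk = resp["Body"].read()  # 1001 bytes: offsets 1000..2000 inclusive

print(byte_range(1000, 2000))  # → bytes=1000-2000
```

Note that the range is inclusive on both ends, so `bytes=1000-2000` returns 1001 bytes, and S3 replies with HTTP 206 (Partial Content) rather than 200.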
I fully agree with @marrrcin here - could you elaborate your concerns? @chensun @ji-yaqi
This is very relevant for us as well!
Thanks 🙏 yes, that’s exactly what I have done 😊 I just wanted to raise this because it could maybe be handled more gracefully.
Same setup for us (IdP) and it’s a high priority to switch to SSO for both dbt core and dbt cloud
Thanks for reopening!