Ankit Chaurasia
I will help @pankajastro with this task
@kaxil I think readthedocs has this as part of `Simple ETL workflow`.
Yes, `spark-submit` works for deploy-mode `local`, so the `SparkSubmitOperator` example DAG works with a local Spark setup. For `SparkSqlOperator`, I am setting up Hive alongside Spark to create a table...
`SparkSqlOperator` requires the `spark-sql` binary, which works when it is installed alongside Spark in the Docker image. Right now, with deploy mode `local`, `SparkSqlOperator` fails because `spark-sql` exits with the following error:...
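For reference, a minimal sketch of what such a local-mode example DAG could look like. This is an assumption, not the actual DAG from the repo: the application path, SQL query, and connection ids are placeholders.

```python
# Hypothetical sketch of a local-mode Spark example DAG.
# The application path, SQL query, and connection ids are placeholders,
# not the actual values used in the astro-sdk repo.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_sql import SparkSqlOperator
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG("example_spark_local", start_date=datetime(2022, 1, 1), schedule=None) as dag:
    # Works with a local Spark setup: spark-submit runs the app in deploy-mode `local`.
    submit = SparkSubmitOperator(
        task_id="spark_submit_job",
        application="/usr/local/airflow/include/pi.py",  # placeholder path
        conn_id="spark_default",
    )

    # Needs the `spark-sql` binary on the worker image; with Hive set up
    # alongside Spark, it can query tables created there.
    sql = SparkSqlOperator(
        task_id="spark_sql_job",
        sql="SELECT COUNT(*) FROM some_table",  # placeholder query
        conn_id="spark_sql_default",
    )

    submit >> sql
```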
It doesn't make sense to have the config on both GitHub and GCS.
> Add context to the task in test

Yup: https://github.com/astronomer/astro-sdk/pull/1068/commits/224e4de4d8fed82f8b3f9f38426c205d9c5efa57
@tatiana should we prioritise this task or keep it in icebox?
We need credentials for astro-sdk instead of astronomer-provider, @rajaths010494.
@rajaths010494 to explore deploy actions and provide an estimate.
@thesuperzapper please take a look