Phani Kumar
@manmeetkaur did you get a chance to try the example DAG given above in your env?
@manmeetkaur please confirm if you got a chance to try the example DAG in your env.
To be included in release 1.9
Got access to dbt cloud today.
ETA for the example DAG that uses `DbtCloudJobRunOperatorAsync`: 18/8.
This is now implemented as part of the unit test for the serialization method of each async trigger. Refer to a sample implementation below for the Snowflake trigger...
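For context, here is a minimal sketch of that serialization-test pattern. It uses a simplified stand-in class rather than the real `SnowflakeTrigger` from astronomer-providers; the classpath, constructor arguments, and attribute names are illustrative assumptions, but the `serialize()` contract shown (return a `(classpath, kwargs)` tuple) is the shape the unit tests assert against:

```python
# Sketch of the serialization unit-test pattern for async triggers.
# SnowflakeTriggerStub is a stand-in, NOT the real trigger class.

class SnowflakeTriggerStub:
    """Mimics the serialize() contract of an Airflow async trigger:
    return the importable classpath plus the kwargs needed to rebuild it."""

    def __init__(self, task_id, poll_interval, query_ids, snowflake_conn_id):
        self.task_id = task_id
        self.poll_interval = poll_interval
        self.query_ids = query_ids
        self.snowflake_conn_id = snowflake_conn_id

    def serialize(self):
        # Classpath below is an assumed example, not a verified import path.
        return (
            "astronomer.providers.snowflake.triggers.snowflake_trigger.SnowflakeTrigger",
            {
                "task_id": self.task_id,
                "poll_interval": self.poll_interval,
                "query_ids": self.query_ids,
                "snowflake_conn_id": self.snowflake_conn_id,
            },
        )


def test_snowflake_trigger_serialization():
    """The test builds a trigger, calls serialize(), and asserts both halves."""
    trigger = SnowflakeTriggerStub(
        task_id="run_sql",
        poll_interval=5.0,
        query_ids=["query-id-1"],
        snowflake_conn_id="snowflake_default",
    )
    classpath, kwargs = trigger.serialize()
    assert classpath.endswith("SnowflakeTrigger")
    assert kwargs["snowflake_conn_id"] == "snowflake_default"
    assert kwargs["query_ids"] == ["query-id-1"]
```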
Getting the below error when I disable XCom pickling and execute my DAG using the latest code on the main branch. I did enable the custom XCom backend with `AIRFLOW__CORE__XCOM_BACKEND:...`
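To illustrate why disabling pickling can surface errors like this: a custom XCom backend chooses between pickle and JSON when serializing task output, and JSON rejects values that pickle would accept. The sketch below is a simplified stand-in (the real backend subclasses `airflow.models.xcom.BaseXCom`; the class name and the `enable_pickling` parameter here are illustrative assumptions):

```python
import json
import pickle

# Simplified stand-in for a custom XCom backend. A real one subclasses
# airflow.models.xcom.BaseXCom and overrides serialize_value /
# deserialize_value; this sketch only shows the pickling-vs-JSON branch.
class JsonXComBackendSketch:
    @staticmethod
    def serialize_value(value, enable_pickling=False):
        if enable_pickling:
            # Pickle accepts almost any Python object.
            return pickle.dumps(value)
        # With pickling disabled, the value must be JSON-serializable;
        # anything else raises TypeError here, which is the class of
        # failure seen when XCom pickling is turned off.
        return json.dumps(value).encode("utf-8")

    @staticmethod
    def deserialize_value(data, enable_pickling=False):
        if enable_pickling:
            return pickle.loads(data)
        return json.loads(data.decode("utf-8"))
```

A JSON-friendly payload round-trips either way, while an arbitrary object only survives the pickle path.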
Tested this again after https://github.com/astronomer/astro-sdk/pull/1060 was merged.

**Scenario**: enabled pickling, used astro-sdk-python==1.1.1

**Task**

```
load_table_with_data = aql.load_file(
    input_file=File(
        path=f"{S3_BUCKET_NAME_SOURCE}/crxbank1.csv",
        filetype=FileType.CSV,
        conn_id=AWS_CONN_ID,
    ),
    task_id="load_csv_data_bank1",
    # output_table=crx_data_table,
    outlets=[Dataset("snowflake://crx_data_bank1")],
    do_xcom_push=True,
)
```

**Result**

```
[2022-10-14,...
```
> **Scenario**:
>
> * Install SDK 1.1.1, run `example_dag` with `load_file` -- and it should succeed (with XCom pickling)
> * Upgrade to the code in the main branch,...
Closing this as the upgrade scenarios have been tested.