astronomer-cosmos
Support for EKS operator
Description
We are using MWAA in combination with EKS, so all our DAGs in Airflow run on our EKS cluster. We would like to use the same setup with Cosmos.
What changes?
- New EKSOperator classes (inheriting from the KubernetesOperator classes), based on the original EksOperator; a rough sketch of the pattern follows this list
- Tests
- Adjusted documentation
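For readers skimming the PR, here is a minimal sketch of the intended pattern, assuming the Amazon provider's EksHook is used to generate a temporary kubeconfig before delegating to the existing Cosmos Kubernetes operator. The class name and constructor arguments are illustrative and may differ from the final code in this PR.

```python
# Illustrative sketch only, not the code in this PR: run a dbt command on an
# existing EKS cluster by reusing Cosmos's Kubernetes operator and letting the
# Amazon provider's EksHook write a short-lived kubeconfig.
from __future__ import annotations

from airflow.providers.amazon.aws.hooks.eks import EksHook
from cosmos.operators.kubernetes import DbtRunKubernetesOperator


class DbtRunEksOperator(DbtRunKubernetesOperator):  # hypothetical class name
    """Executes ``dbt run`` in a pod on an existing EKS cluster."""

    def __init__(
        self,
        cluster_name: str,
        aws_conn_id: str = "aws_default",
        region: str | None = None,
        **kwargs,
    ) -> None:
        self.cluster_name = cluster_name
        self.aws_conn_id = aws_conn_id
        self.region = region
        super().__init__(**kwargs)

    def execute(self, context):
        # Write a temporary kubeconfig for the target cluster and hand it to
        # the underlying KubernetesPodOperator via its ``config_file`` field.
        eks_hook = EksHook(aws_conn_id=self.aws_conn_id, region_name=self.region)
        with eks_hook.generate_config_file(
            eks_cluster_name=self.cluster_name, pod_namespace=self.namespace
        ) as config_file:
            self.config_file = config_file
            return super().execute(context)
```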
Related Issue(s)
Breaking Change?
No - only an additional feature
Checklist
- [x] I have made corresponding changes to the documentation (if required)
- [x] I have added tests that prove my fix is effective or that my feature works
Deploy Preview for sunny-pastelito-5ecb04 ready!

| Name | Link |
|---|---|
| Latest commit | 42fa4c4f2d018ba6b45702d317d116db54febc52 |
| Latest deploy log | https://app.netlify.com/sites/sunny-pastelito-5ecb04/deploys/664de8fe8711f20008ea8fa5 |
| Deploy Preview | https://deploy-preview-944--sunny-pastelito-5ecb04.netlify.app |
Thanks for the contribution, @VolkerSchiewe! Could you please address the currently failing tests?
Hi @tatiana, thanks for getting back to me! I was already working on it, but I ran into this issue: https://github.com/apache/airflow/issues/39103
It seems to affect the latest version of the Amazon Airflow provider. I already tried pinning an older version of xmlsec1, with no success :/
Hi @VolkerSchiewe, just a heads-up: we're wrapping up the Cosmos 1.4 release, and either @pankajkoti or I will support you on this next week!
Looking into the failing tests, I'm not sure where the problems are coming from 🤔
Unit tests: These seem mostly fine. The one or two tests that are failing might just be a version incompatibility; maybe we can exclude them somehow in the compatibility matrix.
Integration tests:
I think the tests I added should not be executed as integration tests, but the imports are still evaluated and fail because the Amazon packages are missing in the integration environment. Should I add a try ... except around the import so the module is skipped when the package is not installed (see the sketch at the end of this comment)? How do you normally handle these cases?
Also, let me know if the test coverage is enough. I tried to cover the parts that are specific to the EKS operator, but I did not want to copy everything from the KubernetesOperator tests since that behaviour is already covered there.
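Something along these lines is what I had in mind; the module path and skip reason are just an example:

```python
# Example of guarding an optional dependency so test collection does not fail
# when the Amazon provider is not installed; the module path is illustrative.
import pytest

try:
    from airflow.providers.amazon.aws.hooks.eks import EksHook
except ImportError:  # provider missing in the integration-test environment
    EksHook = None

# Skip every test in this module instead of erroring at import time.
pytestmark = pytest.mark.skipif(EksHook is None, reason="amazon provider not installed")
```

An alternative would be calling `pytest.importorskip("airflow.providers.amazon.aws")` at the top of the test module.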
It appears that earlier versions of Flask are not compatible with newer versions of Jinja2, and hence we see the failures in our tests.
These are the constraints for Airflow 2.3, which pin Flask 1.1.2 and Jinja2 3.0.3; however, dbt-core upgrades Jinja2 to 3.1.4 (Jinja2<4,>=3.1.3 from dbt-core), which is not supported by the Flask 1.1.2 that ships with Airflow 2.3.
Hey there, the only failing parts are the tests with Airflow 2.3 and Python 3.9. For some reason the install step fails with
pip._vendor.resolvelib.resolvers.ResolutionTooDeep: 200000
when installing the test requirements. I don't really understand why, or how to solve it. Is this a problem you have run into before?
Pinning boto3 in the Airflow 2.3 runs solved the problem 🚀
Codecov Report
All modified and coverable lines are covered by tests ✅
Project coverage is 95.72%. Comparing base (007325a) to head (42fa4c4).
Additional details and impacted files
@@ Coverage Diff @@
## main #944 +/- ##
==========================================
+ Coverage 95.67% 95.72% +0.05%
==========================================
Files 59 60 +1
Lines 2890 2926 +36
==========================================
+ Hits 2765 2801 +36
Misses 125 125
:umbrella: View full report in Codecov by Sentry.
Would it be possible to also paste a snapshot of a successful DAG run using the AWS EKS operator?
I hope this is what you mean, @pankajkoti (the operator naming here still predates the renaming, though).
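For completeness, below is a rough sketch of the kind of DAG behind such a run. The project path, profile details, image, namespace, and cluster name are placeholders, and the EKS-specific execution mode and operator names may differ after the renaming mentioned above; the sketch simply reuses the existing Kubernetes execution mode for illustration.

```python
# Placeholder example of wiring a dbt project into a Cosmos DAG that runs its
# models as pods. Paths, connection IDs, image, and namespace are illustrative.
from datetime import datetime

from cosmos import DbtDag, ExecutionConfig, ExecutionMode, ProfileConfig, ProjectConfig

eks_dbt_dag = DbtDag(
    dag_id="jaffle_shop_on_eks",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    project_config=ProjectConfig("/usr/local/airflow/dags/dbt/jaffle_shop"),
    profile_config=ProfileConfig(
        profile_name="jaffle_shop",
        target_name="dev",
        profiles_yml_filepath="/usr/local/airflow/dags/dbt/jaffle_shop/profiles.yml",
    ),
    execution_config=ExecutionConfig(execution_mode=ExecutionMode.KUBERNETES),
    operator_args={
        # With the operators added in this PR, an EKS-specific execution mode
        # would additionally take the target cluster, e.g. a ``cluster_name``.
        "image": "my-registry/dbt-project:latest",
        "namespace": "dbt",
        "get_logs": True,
        "is_delete_operator_pod": True,
    },
)
```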