Enhancing breeze commands with PACKAGE_LIST env variable
Instead of specifying the provider IDs for each command we run while releasing providers, an environment variable is set once and reused throughout the process. This reduces redundancy and makes it easier to release a subset of providers.
Defining the env variable:
(env-airflow) ➜ airflow git:(enhanceBreezeCommandWithEnv) ✗ export PACKAGE_LIST=mongo,amazon
Running prepare-provider-packages:
(env-airflow) ➜ airflow git:(enhanceBreezeCommandWithEnv) ✗ breeze release-management prepare-provider-packages
Good version of Docker: 24.0.2.
Good version of docker-compose: 2.23.3
Executable permissions on entrypoints are OK
Populating provider list from PACKAGE_LIST env as mongo,amazon
Fetching full history and tags from remote.
This might override your local tags!
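The "Populating provider list from PACKAGE_LIST env" line above comes from the command picking up the variable when no packages are given explicitly. A minimal sketch of how a click command can do this via click's built-in `envvar` fallback (illustrative only, not the actual breeze implementation; the command and option names here are assumptions):

```python
from __future__ import annotations

import click


@click.command()
@click.argument("packages", nargs=-1)
@click.option(
    "--package-list",
    envvar="PACKAGE_LIST",  # click falls back to PACKAGE_LIST when the flag is omitted
    help="Comma-separated list of provider IDs, e.g. 'mongo,amazon'.",
)
def prepare_provider_packages(packages: tuple[str, ...], package_list: str | None) -> None:
    if package_list:
        click.echo(f"Populating provider list from PACKAGE_LIST env as {package_list}")
        packages = tuple(p.strip() for p in package_list.split(","))
    click.echo(f"Selected providers: {packages}")


if __name__ == "__main__":
    prepare_provider_packages()
```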
Running build-docs:
(env-airflow) ➜ airflow git:(enhanceBreezeCommandWithEnv) ✗ breeze build-docs
Good version of Docker: 24.0.2.
Good version of docker-compose: 2.23.3
Executable permissions on entrypoints are OK
The following important files are modified in /Users/adesai/Documents/OSS/airflow since last time image was built:
* pyproject.toml
Likely CI image needs rebuild
Do you want to build the image (this works best when you have good connection and can take usually from 20 seconds to few minutes depending how old your image is)?
Press y/N/q. Auto-select n in 10 seconds (add `--answer n` to avoid delay next time): n
The CI image for Python version 3.8 may be outdated
Please run at the earliest convenience:
breeze ci-image build --python 3.8
Populating provider list from PACKAGE_LIST env as mongo,amazon
Using airflow version from current sources
Leaving default pydantic v2
#################### Available packages ####################
- apache-airflow
- apache-airflow-providers
- apache-airflow-providers-airbyte
- apache-airflow-providers-alibaba
- apache-airflow-providers-amazon
- apache-airflow-providers-apache-beam
- apache-airflow-providers-apache-cassandra
- apache-airflow-providers-apache-drill
- apache-airflow-providers-apache-druid
- apache-airflow-providers-apache-flink
- apache-airflow-providers-apache-hdfs
- apache-airflow-providers-apache-hive
- apache-airflow-providers-apache-impala
- apache-airflow-providers-apache-kafka
- apache-airflow-providers-apache-kylin
- apache-airflow-providers-apache-livy
- apache-airflow-providers-apache-pig
- apache-airflow-providers-apache-pinot
- apache-airflow-providers-apache-spark
- apache-airflow-providers-apprise
- apache-airflow-providers-arangodb
- apache-airflow-providers-asana
- apache-airflow-providers-atlassian-jira
- apache-airflow-providers-celery
- apache-airflow-providers-cloudant
- apache-airflow-providers-cncf-kubernetes
- apache-airflow-providers-cohere
- apache-airflow-providers-common-io
- apache-airflow-providers-common-sql
- apache-airflow-providers-databricks
- apache-airflow-providers-datadog
- apache-airflow-providers-dbt-cloud
- apache-airflow-providers-dingding
- apache-airflow-providers-discord
- apache-airflow-providers-docker
- apache-airflow-providers-elasticsearch
- apache-airflow-providers-exasol
- apache-airflow-providers-fab
- apache-airflow-providers-facebook
- apache-airflow-providers-ftp
- apache-airflow-providers-github
- apache-airflow-providers-google
- apache-airflow-providers-grpc
- apache-airflow-providers-hashicorp
- apache-airflow-providers-http
- apache-airflow-providers-imap
- apache-airflow-providers-influxdb
- apache-airflow-providers-jdbc
- apache-airflow-providers-jenkins
- apache-airflow-providers-microsoft-azure
- apache-airflow-providers-microsoft-mssql
- apache-airflow-providers-microsoft-psrp
- apache-airflow-providers-microsoft-winrm
- apache-airflow-providers-mongo
- apache-airflow-providers-mysql
- apache-airflow-providers-neo4j
- apache-airflow-providers-odbc
- apache-airflow-providers-openai
- apache-airflow-providers-openfaas
- apache-airflow-providers-openlineage
- apache-airflow-providers-opensearch
- apache-airflow-providers-opsgenie
- apache-airflow-providers-oracle
- apache-airflow-providers-pagerduty
- apache-airflow-providers-papermill
- apache-airflow-providers-pgvector
- apache-airflow-providers-pinecone
- apache-airflow-providers-postgres
- apache-airflow-providers-presto
- apache-airflow-providers-qdrant
- apache-airflow-providers-redis
- apache-airflow-providers-salesforce
- apache-airflow-providers-samba
- apache-airflow-providers-segment
- apache-airflow-providers-sendgrid
- apache-airflow-providers-sftp
- apache-airflow-providers-singularity
- apache-airflow-providers-slack
- apache-airflow-providers-smtp
- apache-airflow-providers-snowflake
- apache-airflow-providers-sqlite
- apache-airflow-providers-ssh
- apache-airflow-providers-tableau
- apache-airflow-providers-tabular
- apache-airflow-providers-telegram
- apache-airflow-providers-teradata
- apache-airflow-providers-trino
- apache-airflow-providers-vertica
- apache-airflow-providers-weaviate
- apache-airflow-providers-yandex
- apache-airflow-providers-zendesk
- docker-stack
- helm-chart
Current package filters:
['apache-airflow-providers-mongo', 'apache-airflow-providers-amazon']
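The short provider IDs from PACKAGE_LIST end up as the full apache-airflow-providers-* package filters shown above. Roughly, the mapping follows the naming convention below (a sketch, not the actual filter-building code; the helper name is made up):

```python
def expand_provider_ids(provider_ids: list[str]) -> list[str]:
    # "mongo" -> "apache-airflow-providers-mongo",
    # "apache.beam" -> "apache-airflow-providers-apache-beam"
    return [f"apache-airflow-providers-{provider_id.replace('.', '-')}" for provider_id in provider_ids]


print(expand_provider_ids(["mongo", "amazon"]))
# ['apache-airflow-providers-mongo', 'apache-airflow-providers-amazon']
```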
Running publish-docs:
(env-airflow) ➜ airflow git:(enhanceBreezeCommandWithEnv) ✗ breeze release-management publish-docs
Populating provider list from PACKAGE_LIST env as mongo,amazon
Publishing docs for 2 package(s)
- apache-airflow-providers-amazon
- apache-airflow-providers-mongo
Doesn't make sense to add this for add-back-references, as it is idempotent.
cc @eladkal, who added this support
@eladkal can you take a look at this PR when you have some time?
@potiuk I handled the review comments. Here is an example of the case where both the environment variable and individual arguments are passed to the command; I am publishing a warning like so:
➜ airflow git:(enhanceBreezeCommandWithEnv) ✗ export PACKAGE_LIST=mongo,amazon,apache.beam
➜ airflow git:(enhanceBreezeCommandWithEnv) ✗ breeze build-docs sftp
Good version of Docker: 24.0.2.
Good version of docker-compose: 2.23.3
Executable permissions on entrypoints are OK
The following important files are modified in /Users/adesai/Documents/OSS/airflow since last time image was built:
* pyproject.toml
Likely CI image needs rebuild
Do you want to build the image (this works best when you have good connection and can take usually from 20 seconds to few minutes depending how old your image is)?
Press y/N/q. Auto-select n in 10 seconds (add `--answer n` to avoid delay next time): n
The CI image for Python version 3.8 may be outdated
Please run at the earliest convenience:
breeze ci-image build --python 3.8
Populating provider list from PACKAGE_LIST env as mongo,amazon,apache.beam
Both package arguments and --package-list / PACKAGE_LIST passed. Overriding to ('mongo', 'amazon', 'apache.beam')
^C
Aborted.
Oops, an image is needed to see the color coding.
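The precedence behind that warning: when both positional package arguments and --package-list / PACKAGE_LIST are given, the list wins and the warning is printed. A minimal sketch of that logic (illustrative, not the actual breeze code; the function name is made up):

```python
def resolve_packages(cli_packages: tuple[str, ...], package_list: str | None) -> tuple[str, ...]:
    # --package-list / PACKAGE_LIST takes precedence over positional arguments, with a warning.
    if package_list:
        list_packages = tuple(p.strip() for p in package_list.split(","))
        if cli_packages:
            print(
                "Both package arguments and --package-list / PACKAGE_LIST passed. "
                f"Overriding to {list_packages}"
            )
        return list_packages
    return cli_packages


print(resolve_packages(("sftp",), "mongo,amazon,apache.beam"))
# ('mongo', 'amazon', 'apache.beam')
```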
@potiuk @eladkal Can you take a look when you have some time? I will fix the static checks.
Yeah. And conflicts :)
@potiuk @eladkal I just fixed the build-docs failures and the failing checks, and also rebased. Should be green now, hopefully!