Fix treatment of "#" in S3Hook.parse_s3_url()
Apache Airflow version
2.8.4 in my environment, but the issue is still present in main
What happened
A client submitted an S3 file to my workflow with an octothorpe (`#`) in the filename, essentially `s3://my-bucket/path/to/key/email campaign - PO# 123456_REPORT.csv`. When my Airflow DAG tried to parse this URL, part of the filename was lost:
```python
>>> s3_key = 's3://my-bucket/path/to/key/email campaign - PO# 123456_REPORT.csv'
>>> S3Hook.parse_s3_url(s3_key)
('my-bucket', 'path/to/key/email campaign - PO')
```
What you think should happen instead
The key should not be truncated. The result of the above example should be `('my-bucket', 'path/to/key/email campaign - PO# 123456_REPORT.csv')`.
How to reproduce
Call `S3Hook.parse_s3_url()` with a `#` character in the S3 URL. Everything after the `#` is lost, because `urllib.parse.urlsplit()` is currently called with the default option `allow_fragments=True`, which treats the `#` as the start of a URL fragment.
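For reference, the underlying `urlsplit` behavior can be demonstrated without Airflow, using the example URL from above:

```python
from urllib.parse import urlsplit

url = "s3://my-bucket/path/to/key/email campaign - PO# 123456_REPORT.csv"

# Default: everything after '#' is treated as a URL fragment
# and split away from the path.
print(urlsplit(url))
# SplitResult(scheme='s3', netloc='my-bucket',
#             path='/path/to/key/email campaign - PO',
#             query='', fragment=' 123456_REPORT.csv')

# With allow_fragments=False, the '#' stays part of the path.
print(urlsplit(url, allow_fragments=False))
# SplitResult(scheme='s3', netloc='my-bucket',
#             path='/path/to/key/email campaign - PO# 123456_REPORT.csv',
#             query='', fragment='')
```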
This PR passes `allow_fragments=False` to `urlsplit` to prevent this error. As far as I can tell, there are no valid cases where a `#` in an S3 key should be treated as a fragment, and there is no existing GitHub issue tracking this.
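For illustration only, here is a minimal sketch of the fixed parsing logic. This is not the actual provider code: the real `S3Hook.parse_s3_url()` also handles `https://`-style S3 URLs and validates its input.

```python
from urllib.parse import urlsplit


def parse_s3_url(s3url: str) -> tuple[str, str]:
    """Simplified sketch of the fix, covering only s3:// style URLs."""
    # allow_fragments=False stops urlsplit from splitting the key on '#'.
    parsed = urlsplit(s3url, allow_fragments=False)
    bucket = parsed.netloc
    key = parsed.path.lstrip("/")
    return bucket, key


print(parse_s3_url("s3://my-bucket/path/to/key/email campaign - PO# 123456_REPORT.csv"))
# ('my-bucket', 'path/to/key/email campaign - PO# 123456_REPORT.csv')
```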
Operating System
Ubuntu 22.04
Versions of Apache Airflow Providers
| Provider | Version |
|---|---|
| apache-airflow-providers-amazon | 8.20.0 |
| apache-airflow-providers-celery | 3.6.2 |
| apache-airflow-providers-common-io | 1.3.1 |
| apache-airflow-providers-common-sql | 1.12.0 |
| apache-airflow-providers-ftp | 3.8.0 |
| apache-airflow-providers-http | 4.10.1 |
| apache-airflow-providers-imap | 3.5.0 |
| apache-airflow-providers-postgres | 5.10.2 |
| apache-airflow-providers-sendgrid | 3.4.0 |
| apache-airflow-providers-sftp | 4.9.1 |
| apache-airflow-providers-smtp | 1.6.1 |
| apache-airflow-providers-sqlite | 3.7.1 |
| apache-airflow-providers-ssh | 3.10.1 |
Deployment
This may be reproduced without deploying
Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
Code of Conduct
- [x] I agree to follow this project's Code of Conduct
Congratulations on your first Pull Request and welcome to the Apache Airflow community! If you have any issues or are unsure about anything, please check our Contributors' Guide (https://github.com/apache/airflow/blob/main/contributing-docs/README.rst). Here are some useful points:
- Pay attention to the quality of your code (ruff, mypy and type annotations). Our pre-commits will help you with that.
- In case of a new feature, add useful documentation (in docstrings or in the docs/ directory). Adding a new operator? Check this short guide, and consider adding an example DAG that shows how users should use it.
- Consider using the Breeze environment for testing locally. It's a heavy Docker setup, but it ships with a working Airflow and a lot of integrations.
- Be patient and persistent. It might take some time to get a review or get the final approval from Committers.
- Please follow ASF Code of Conduct for all communication including (but not limited to) comments on Pull Requests, Mailing list and Slack.
- Be sure to read the Airflow Coding style.
- Always keep your Pull Requests rebased, otherwise your build might fail due to changes not related to your commits.

Apache Airflow is a community-driven project and together we are making it better 🚀. In case of doubts contact the developers at:
Mailing List: [email protected]
Slack: https://s.apache.org/airflow-slack
Static checks are failing; running pre-commit should auto-resolve them.
Static tests are failing.
Awesome work, congrats on your first merged pull request! You are invited to check our Issue Tracker for additional contributions.