Add AWS region me-central-1
Description
We need support for the opt-in AWS region me-central-1.
Resources:
- Endpoint: https://docs.aws.amazon.com/general/latest/gr/ec2-service.html
- Country and `signature_version`: https://docs.aws.amazon.com/general/latest/gr/s3.html
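As an illustration of what the linked AWS docs provide, the per-region metadata for a new opt-in region boils down to a handful of fields. This is a hypothetical sketch with field names of our own choosing, not libcloud's actual constants:

```python
# Hypothetical region-metadata entry for me-central-1 (UAE). The keys
# here are illustrative; libcloud's generated constants use their own
# structure. Values come from the AWS endpoint docs linked above.
ME_CENTRAL_1 = {
    "id": "me-central-1",
    "endpoint": "ec2.me-central-1.amazonaws.com",
    "country": "United Arab Emirates",
    # Newer opt-in regions support only Signature Version 4.
    "signature_version": "4",
}
```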
Status
- Done, ready for review.
We tried to follow the documentation (see [1]) to update these files:
- contrib/scrape-ec2-prices.py
- contrib/scrape-ec2-sizes.py
Sadly, when running `tox -e scrape-ec2-sizes,scrape-ec2-prices`, scrape-ec2-sizes failed:
```
root@a2eea4b5cfec:/work/libcloud# tox -e scrape-ec2-sizes,scrape-ec2-prices
scrape-ec2-sizes: commands[0]> bash -c 'echo "Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large"'
Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large
scrape-ec2-sizes: commands[1]> bash -c 'python contrib/scrape-ec2-sizes.py'
Scraping size data, this may take up to 10-15 minutes...
100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 6.00G/6.00G [02:12<00:00, 45.4MiB/s]
scrape-ec2-sizes: exit -9 (652.54 seconds) /work/libcloud> bash -c 'python contrib/scrape-ec2-sizes.py' pid=21654
scrape-ec2-sizes: FAIL ✖ in 10 minutes 52.65 seconds
scrape-ec2-prices: commands[0]> python contrib/scrape-ec2-prices.py
Scraping EC2 pricing data (if this runs for the first time it has to download a 3GB file, depending on your bandwith it might take a while)....
Using data from existing cached file /tmp/ec.json (mtime=2024-07-22 21:09:14 UTC)
Starting to parse pricing data, this could take up to 15 minutes...
297107179it [09:43, 508892.55it/s]
Using data from existing cached file /tmp/ec.json (mtime=2024-07-22 21:09:14 UTC)
Starting to parse pricing data, this could take up to 15 minutes...
101166581it [05:49, 289104.63it/s]
Unexpected OS Ubuntu Pro
Unexpected OS Ubuntu Pro
...
scrape-ec2-sizes: FAIL code -9 (652.65=setup[0.09]+cmd[0.01,652.54] seconds)
scrape-ec2-prices: OK (942.42=setup[0.12]+cmd[942.30] seconds)
evaluation failed :( (1595.18 seconds)
```
Afterwards we retried on trunk and it's also failing:
```
root@a2eea4b5cfec:/work/libcloud# tox -e scrape-ec2-sizes
scrape-ec2-sizes: commands[0]> bash -c 'echo "Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large"'
Scrapping EC2 sizes, this may take up to 10 minutes or more since the actual JSON data we download and scrape is very large
scrape-ec2-sizes: commands[1]> bash -c 'python contrib/scrape-ec2-sizes.py'
Scraping size data, this may take up to 10-15 minutes...
Using data from existing cached file /tmp/ec.json
scrape-ec2-sizes: exit -9 (576.53 seconds) /work/libcloud> bash -c 'python contrib/scrape-ec2-sizes.py' pid=1832
scrape-ec2-sizes: FAIL code -9 (576.78=setup[0.19]+cmd[0.06,576.53] seconds)
evaluation failed :( (577.21 seconds)
```
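For what it's worth, the `exit -9` that tox reports means the child process died from signal 9 (SIGKILL), which was never delivered by the script itself. Given that the script parses a ~6 GB JSON file, the Linux OOM killer is a plausible (though unconfirmed here) culprit. A minimal sketch of how a SIGKILL-ed child shows up in Python:

```python
# Demonstrate how a process killed by SIGKILL is reported: subprocess
# encodes death-by-signal as a negative return code, matching the
# "exit -9" in the tox output above.
import signal
import subprocess

# Spawn a shell that immediately sends SIGKILL to itself.
proc = subprocess.run(["sh", "-c", "kill -9 $$"])

print(proc.returncode)                        # negative => killed by signal
print(signal.Signals(-proc.returncode).name)  # name of that signal
```

If memory pressure is indeed the cause, `dmesg` on the host usually records the OOM kill.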
The example mentioned in the documentation (see [2]) is also outdated: the list `EC2_REGIONS` no longer exists in contrib/scrape-ec2-prices.py.
The change we made seems to be sufficient to delete a VM. Are there other tests we should run?
[1] https://libcloud.readthedocs.io/en/latest/development.html#updating-ec2-sizing-and-supported-regions-data
[2] https://github.com/apache/libcloud/commit/762f0e5623b6f9837204ffe27d825b236c9c9970
Checklist (tick everything that applies)
- [x] Code linting (required, can be done after the PR checks)
- [ ] Documentation
- [ ] Tests
- [ ] ICLA (required for bigger changes)
Is this PR ready for review, or can someone investigate the problem? I also tried running tox; scrape-ec2-sizes.py downloads a 6 GB ec.json file, which looks okay, but the script then fails with this error:
```
Using data from existing cached file /tmp/ec.json
scrape-ec2-sizes: exit -9 (837.75 seconds) /opt/libcloud> bash -c 'python contrib/scrape-ec2-sizes.py' pid=28
scrape-ec2-sizes: FAIL ✖ in 13 minutes 58.98 seconds
```
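One thing worth checking when reruns keep failing: the logs show the script reusing a cached /tmp/ec.json from an earlier attempt, so a truncated or stale cache from a previous run would poison every retry. A small hypothetical helper for inspecting and clearing that cache (the path is taken from the log output, not from the script's source):

```python
# Hypothetical cache-inspection helpers; CACHE_PATH matches the file
# named in the log above, but the real script may cache elsewhere.
import os
import time

CACHE_PATH = "/tmp/ec.json"

def cache_age_hours(path: str = CACHE_PATH):
    """Return the cache file's age in hours, or None if it is absent."""
    if not os.path.exists(path):
        return None
    return (time.time() - os.path.getmtime(path)) / 3600.0

def drop_stale_cache(path: str = CACHE_PATH, max_age_hours: float = 24.0) -> bool:
    """Delete the cache if older than max_age_hours; return True if deleted."""
    age = cache_age_hours(path)
    if age is not None and age > max_age_hours:
        os.remove(path)
        return True
    return False
```

Deleting the cached file forces a fresh download on the next run, which rules out a corrupt partial download as the cause.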
Thanks.
Codecov Report
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 83.40%. Comparing base (1117987) to head (0a5bc0d). Report is 45 commits behind head on trunk.
Additional details and impacted files
```
@@ Coverage Diff @@
##            trunk   #2030   +/- ##
=======================================
  Coverage   83.40%  83.40%
=======================================
  Files         353     353
  Lines       81685   81685
  Branches     8632    8632
=======================================
  Hits        68124   68124
  Misses      10738   10738
  Partials     2823    2823
```
| Files with missing lines | Coverage Δ | |
|---|---|---|
| libcloud/compute/constants/ec2_instance_types.py | 100.00% <ø> (ø) | |
| ...d/compute/constants/ec2_region_details_complete.py | 100.00% <ø> (ø) | |
| libcloud/storage/drivers/s3.py | 89.61% <ø> (ø) | |
Sorry for the delay. The PR (with somewhat related updates to AWS EC2 sizes and prices) has been merged into trunk.
Thanks for the contribution.