git-cliff
feat(bitbucket): support self-hosted instances
Description
Atlassian has different APIs for the Cloud and Server (self-hosted) versions of Bitbucket. Therefore, a dedicated remote is needed to handle the self-hosted version.
Motivation and Context
closes #762
How Has This Been Tested?
Screenshots / Logs (if applicable)
Types of Changes
- [ ] Bug fix (non-breaking change which fixes an issue)
- [x] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)
- [ ] Documentation (no code change)
- [ ] Refactor (refactoring production code)
- [ ] Other
Checklist:
:warning: Please install the Codecov GitHub app to ensure uploads and comments are reliably processed by Codecov.
Codecov Report
Attention: Patch coverage is 19.81132% with 85 lines in your changes missing coverage. Please review.
Project coverage is 38.64%. Comparing base (7415289) to head (969a74c). Report is 21 commits behind head on main.
:exclamation: Your organization needs to install the Codecov GitHub app to enable full functionality.
Additional details and impacted files
@@ Coverage Diff @@
## main #763 +/- ##
==========================================
- Coverage 40.02% 38.64% -1.37%
==========================================
Files 20 22 +2
Lines 1642 1703 +61
==========================================
+ Hits 657 658 +1
- Misses 985 1045 +60
| Flag | Coverage Δ | |
|---|---|---|
| unit-tests | 38.64% <19.82%> (-1.37%) | :arrow_down: |
@orhun Absolutely, I'll continue on it. I just haven't had much time lately to pick up where I started.
I really like the idea of switching the implementation based on whether a custom URI was used. It felt really odd to have two separate configs even though they achieve pretty much the same thing.
A few things are obviously still missing here and there; testing in particular is an issue, I guess.
Getting a self-hosted Bitbucket server running is easy since you can just grab the latest Docker image and run it, but then you need to get a 30-day license and so on. That's way too many steps just to verify the implementation.
As far as I understand, the API isn't really evolving much (maybe the Server variant eventually gets updated to API v2 at some point). So my idea is to record a few queries and save them in the repo, anonymized of course. Then we can decide whether to use a full-blown HTTP mocking crate or simply deserialize the files into expected responses. That way we can at least verify the implementation to some degree without having to run a full server every time. Might be a good solution for the other backends as well.
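For illustration, a minimal sketch of that fixture-based approach: deserialize a recorded, anonymized response that is checked into the repo instead of hitting a live server. It assumes serde/serde_json are available; the struct, its fields, and the fixture path are hypothetical placeholders, not git-cliff's actual types.

```rust
// Sketch: test against a recorded API response instead of a live
// Bitbucket Server. `BitbucketPullRequest` and the fixture path are
// hypothetical placeholders.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct BitbucketPullRequest {
    id: u64,
    title: String,
}

#[test]
fn deserializes_recorded_pull_request_response() {
    // Anonymized response captured once and committed to the repository.
    let raw = include_str!("fixtures/bitbucket_server_pull_requests.json");
    let prs: Vec<BitbucketPullRequest> =
        serde_json::from_str(raw).expect("fixture should deserialize");
    assert!(!prs.is_empty());
}
```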
> Absolutely, I'll continue on it. I just haven't had much time lately to pick up where I started.
Thanks for showing interest! Just an idea: maybe it makes sense to start this PR over, since we won't have two separate functionalities for this :)
> Getting a self-hosted Bitbucket server running is easy since you can just grab the latest Docker image and run it, but then you need to get a 30-day license and so on. That's way too many steps just to verify the implementation.
Wow, that's really bad ;/
> So my idea is to record a few queries and save them in the repo, anonymized of course. Then we can decide whether to use a full-blown HTTP mocking crate or simply deserialize the files into expected responses. That way we can at least verify the implementation to some degree without having to run a full server every time. Might be a good solution for the other backends as well.
Loved this idea! This will also save us from spurious CI errors due to networking issues that happen occasionally with GitLab etc.
Would love to have this in!
No need to start it over; we can use the power of rebase and force pushes :wink:. But it's basically somewhat of a start-over anyway, haha.
First step done: the Bitbucket implementation now picks the Cloud or Server variant based on whether a custom API URL is set.
Next I'll add some tests and finish up the missing docs (clippy will probably fail).
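As an illustration of the selection described above, a hedged sketch that assumes the remote config exposes an optional API URL; the enum, struct, and field names are hypothetical, not git-cliff's actual internals.

```rust
// Hypothetical sketch: choose the Bitbucket API variant from the remote config.
enum BitbucketVariant {
    Cloud,  // bitbucket.org (Cloud REST API)
    Server, // self-hosted Bitbucket Server / Data Center REST API
}

struct RemoteConfig {
    /// Custom API URL, set only for self-hosted instances (hypothetical field).
    api_url: Option<String>,
}

fn pick_variant(config: &RemoteConfig) -> BitbucketVariant {
    // A custom API URL implies a self-hosted instance; otherwise use Cloud.
    if config.api_url.is_some() {
        BitbucketVariant::Server
    } else {
        BitbucketVariant::Cloud
    }
}
```

Keying the choice on a single optional field avoids a second, parallel remote configuration, which matches the direction discussed earlier in this thread.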
Perfect!
Just lmk when this is ready for review :)
Definitely! The basics are ready; I just haven't had a chance to collect some API calls for automated tests yet.
Once I feel it's ready I'll move it from a draft into a regular PR.
Just a small heads-up: I haven't forgotten about the PR. Priorities are elsewhere at the moment, so I simply haven't gotten around to pulling and sanitizing those request/response samples.
But I'm sure I'll get to it sometime.
No worries, sounds good :)
Hey, lost interest? :) (it's totally fine)
Hey, sorry about that. I probably should have written a comment first and then closed the PR (it was actually auto-closed when I deleted the branch).
This was mostly for work and its priority was pretty low. On top of that, the latest changes made a rebase pretty hard, and I've forgotten half of my PR changes because I hadn't touched it in a long time.
So I figured it's best to close this for now and instead make a fresh PR from scratch that includes the missing bits we discussed here.