Elijah Newren
--refs was added by commit 3e153806ff47 (lint-history: Add --refs argument, 2021-12-30)...and fixed in commit 9cf6121b34fe (lint-history: fix broken --refs option, 2022-10-03). So that one at least is already taken care...
Not a bug; it's how fast-export is designed. What exactly are you trying to get, though -- the original hash, or the new hash after the blob-callback runs? If you...
The current hash isn't actually available to filter-repo; the basic pipeline is `git fast-export | git filter-repo | git fast-import` (though filter-repo sets up this whole pipeline as well). Until...
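A toy sketch of that stream-rewriting idea, with stand-in data rather than filter-repo's actual code or git's real fast-export format: each stage consumes the previous stage's stream, and the filtering stage rewrites records as they flow past, without ever seeing the final object hashes (fast-import computes those later).

```python
# Toy illustration of the fast-export | filter | fast-import pipeline.
# Stand-in records; real filter-repo parses git's fast-export stream.

def fast_export():
    # Pretend stream: one "blob" record per line.
    yield "blob deadbeef hello"
    yield "blob cafebabe secret"

def filter_stream(stream):
    # The "filter-repo" stage rewrites records as they pass through;
    # it never sees final object hashes -- those are computed downstream.
    for record in stream:
        yield record.replace("secret", "REDACTED")

def fast_import(stream):
    # Final stage materializes the rewritten records.
    return list(stream)

result = fast_import(filter_stream(fast_export()))
```

This is why the "current" (post-rewrite) hash can't be handed to a filter: at the time the filter runs, fast-import hasn't produced it yet.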
If you want to use blob.original_id, just use blob.original_id. It's an available field for your use. Replacing blob.id with blob.original_id would be wrong since a filter may have modified the...
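A toy model of why that substitution would be wrong (stand-in class and hash, not filter-repo's real types; in filter-repo, `id` is actually a fast-import mark rather than a hash): once a filter modifies the contents, the old identifier still names the old bytes, so a commit that kept pointing at it would resurrect the unfiltered blob.

```python
import hashlib
from dataclasses import dataclass

# Toy stand-in for a blob with a pre-filter identifier (original_id)
# and a current identifier (id) that must track rewritten contents.

def toy_hash(data: bytes) -> str:
    return hashlib.sha1(data).hexdigest()[:8]

@dataclass
class Blob:
    data: bytes
    original_id: str = ""
    id: str = ""
    def __post_init__(self):
        self.original_id = toy_hash(self.data)
        self.id = self.original_id

blob = Blob(b"password=hunter2\n")
before = blob.original_id

# A filter rewrites the contents; the current id changes with them.
blob.data = b"password=REDACTED\n"
blob.id = toy_hash(blob.data)

# original_id still names the old contents; id names the new ones.
# A commit that kept referencing original_id would keep the old blob.
```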
Ah, gotcha. Changing that would be even more wrong as it'd guarantee that commits continued using old blobs regardless of whether they were modified by filters. (When commits are written...
It's not that it filtered incorrectly, it's that it exits with an error before completing:

```
$ git filter-repo --subdirectory-filter libexec/ftpd
Parsed 363989 commits
New history written in 260.90 seconds; now...
```
Where did my comment from last night go, the one saying to ignore the above because I was using a dirty version of git? Anyway, I reproduced it last night but need to find...
I don't understand what good that would do. If you have any blob repeated a trillion times, Git will only store one copy of it, so removing highly duplicated blobs...
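The reason duplicates are free: git names a blob by hashing its contents (a small header plus the bytes), so any number of identical copies collapse to a single object. A sketch of that computation, which for a regular small file matches what `git hash-object` prints:

```python
import hashlib

# Git's blob object id: SHA-1 over "blob <size>\0" + contents.
def git_blob_id(data: bytes) -> str:
    header = b"blob %d\0" % len(data)
    return hashlib.sha1(header + data).hexdigest()

# Identical contents always produce the identical object id,
# so git stores exactly one copy no matter how often it appears.
a = git_blob_id(b"same bytes\n")
b = git_blob_id(b"same bytes\n")
```

The well-known id of the empty blob, `e69de29bb2d1d6434b8b29ae775ad8c2e48c5391`, falls out of the same formula.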
Sorry, I'm still confused. My scenario 1 was someone committing a blob (let's say its hash abbreviates to deadbeef01) at some path (let's say subdir/dir2/somefile.ext), then updating those contents periodically,...
Ah, a list of paths with a count of changes to the content stored at that path. Thanks for the explanation. Maybe in a file named "frequency-of-changes-per-path.txt" ? I'd be...
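A minimal sketch of what such a report could compute, using hypothetical stand-in data (the nested lists play the role of each commit's file-change list): count how many commits touched each path, then sort most-changed first.

```python
from collections import Counter

# Hypothetical per-commit lists of changed paths.
commits = [
    ["subdir/dir2/somefile.ext", "README.md"],
    ["subdir/dir2/somefile.ext"],
    ["subdir/dir2/somefile.ext", "docs/guide.txt"],
]

# Tally one count per (commit, path) change.
counts = Counter(path for paths in commits for path in paths)

# Most frequently changed paths first; ties broken alphabetically.
report = sorted(counts.items(), key=lambda kv: (-kv[1], kv[0]))
lines = [f"{n}\t{path}" for path, n in report]
```

In a real repository one rough approximation is `git log --name-only --format= | sort | uniq -c | sort -rn`, though that counts renames and merges differently than filter-repo might.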