Support git lfs
Cloning a git-lfs repository fails. I used the insteadOf configuration to rewrite URLs, followed by a normal git clone.
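For reference, the setup was roughly the following (a sketch: the cache address localhost:8088 and the bitbucket.org repository are taken from the error log below; the exact config the reporter used may differ):

```shell
# Rewrite upstream URLs so that clones go through the local
# git-cache-http-server instance (address taken from the log below).
git config --global \
  url."http://localhost:8088/bitbucket.org/".insteadOf \
  "https://bitbucket.org/"
```

After that, a plain `git clone https://bitbucket.org/kalkulo/test.git` is transparently routed through the cache, which is where the LFS smudge step fails.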
The following error is reported (git client-side):
Downloading UnityProjects/NewProject/Assets/ETopo.obj (20 MB)
Error downloading object: UnityProjects/NewProject/Assets/ETopo.obj (cf2da21): Smudge error: Error downloading UnityProjects/NewProject/Assets/ETopo.obj (cf2da21ba617d8a78d3cf02a984da6365cace149883bdbab2a214be4d52857f0): batch response: Fatal error: Server error: http://localhost:8088/jobh:<redacted>@bitbucket.org/kalkulo/test.git/info/lfs/objects/batch
On the git-cache-http-server side:
Called from Main.getParams (/usr/local/lib/node_modules/git-cache-http-server/bin/git-cache-http-server.js line 65)
Called from Main.handleRequest (/usr/local/lib/node_modules/git-cache-http-server/bin/git-cache-http-server.js line 105)
Called from events.js line 126
Called from events.js line 214
Called from _http_server.js line 619
Called from _http_common.js line 115
POST /jobh:<redacted>@bitbucket.org/kalkulo/test.git/info/lfs/objects/batch
ERROR: Cannot deal with url
Looking at the git-lfs spec, it might be as simple as passing the "batch" request through (doc) and caching the "transfer" result (doc), perhaps optionally ignoring the authentication token when matching a cached transfer.
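To make the pass-through concrete, this is roughly the request the client is sending when it fails (shape per the git-lfs Batch API spec; the OID is the one from the error log above, the size in bytes is illustrative). The cache would only need to forward this POST, unchanged, to the upstream /info/lfs/objects/batch endpoint:

```shell
# JSON body of an LFS Batch API "download" request. OID taken from
# the error log; size (bytes) is illustrative.
payload='{
  "operation": "download",
  "transfers": ["basic"],
  "objects": [
    {"oid": "cf2da21ba617d8a78d3cf02a984da6365cace149883bdbab2a214be4d52857f0", "size": 20971520}
  ]
}'
echo "$payload"

# The cache would forward it unchanged, e.g.:
#   curl -X POST \
#     -H 'Accept: application/vnd.git-lfs+json' \
#     -H 'Content-Type: application/vnd.git-lfs+json' \
#     -d "$payload" \
#     https://bitbucket.org/kalkulo/test.git/info/lfs/objects/batch
```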
I'll take a first look at this... thanks for the relevant links.
Maybe the initial step should be to make sure LFS can pass through and work, without actually worrying about caching the objects themselves.
As for caching the API responses (if I understood "caching the 'transfer' result" correctly), I need to look into how the authentication works. As far as I know (which isn't much, only from debugging a few annoying issues), some providers use S3 with credentials that are only valid for about 10 minutes.
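To illustrate that concern: a batch response typically embeds short-lived pre-signed download URLs, so caching the response verbatim would eventually serve dead links. A sketch of such a response (shape per the git-lfs Batch API spec; the bucket, URL, and timestamp are made up):

```shell
# Illustrative Batch API response. Note the expires_at and the
# pre-signed S3 href -- both are made-up values here, but show why the
# response itself is not safely cacheable.
response='{
  "transfer": "basic",
  "objects": [
    {
      "oid": "cf2da21ba617d8a78d3cf02a984da6365cace149883bdbab2a214be4d52857f0",
      "size": 20971520,
      "actions": {
        "download": {
          "href": "https://example-bucket.s3.amazonaws.com/cf2da21?X-Amz-Expires=600",
          "expires_at": "2018-01-01T12:10:00Z"
        }
      }
    }
  ]
}'
echo "$response" | grep -o '"expires_at": "[^"]*"'
```

So a cache would likely have to store the object bytes keyed by OID, and re-issue the batch request upstream whenever a stored URL's expires_at has passed, rather than replaying the API response.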
- I would very much welcome a pull request for LFS (even without object caching)
- someone else has done some work on this (diff master...RossRH:master)