roam-to-git
Git process is terminated before it finishes pushing the repository
Describe the bug The backup seems to complete successfully, but the workflow terminates before git can push the changes to the repository.
To Reproduce Use the workflow with a large enough Roam database.
Expected behavior The git process is allowed to complete; instead, it is terminated by https://github.com/MatthieuBizien/roam-to-git/blob/2381924739a6f848165c483d6d0a2af3cdca1916/roam_to_git/scrapping.py#L290
Traceback See log in https://github.com/Stvad/roam-notes-workflow/actions/runs/8224616595/job/22488653469
Please complete the following information:
- OS: Ubuntu-latest
- Do you use Github Action?: Yes
- Do you use multiple Roam databases?: No
- Did roam-to-git work for you before? When precisely did it stop working?: I'm restarting it after a long period of inactivity
- Are some backup runs still working?: No
I've tried disabling that cleanup with https://github.com/MatthieuBizien/roam-to-git/commit/558917b92fff9c4b7f9d3b622bdd4e14314f62c5, but it seems the issue is that for some reason the parent process is terminated, so just disabling the cleanup is not sufficient 🤔
OK, I can repro locally on macOS as well.
ah, got to the root of it =\
```
remote: error: See https://gh.io/lfs for more information.
remote: error: File json/dbname.json is 126.14 MB; this exceeds GitHub's file size limit of 100.00 MB
remote: error: GH001: Large files detected. You may want to try Git Large File Storage - https://git-lfs.github.com.
```
OK, after configuring LFS manually and committing the .gitattributes file, things work now: https://docs.github.com/en/repositories/working-with-files/managing-large-files/configuring-git-large-file-storage
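For reference, the manual LFS setup was along these lines (a sketch, assuming the exports live under `json/`; the exact pattern depends on your repo layout):

```shell
# Install LFS hooks for this repo, then track the large JSON exports.
git lfs install
git lfs track "json/*.json"
# The tracking rules live in .gitattributes and must be committed
# before the large files themselves are added.
git add .gitattributes
git commit -m "Track Roam JSON exports with Git LFS"
```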
Ah, but GitHub LFS has a 1 GB limit, and it seems it uploads a new copy of the file each time instead of diffing, so I ended up chunking the JSON into smaller parts instead: https://github.com/Stvad/roam-notes-workflow/blob/master/.github/workflows/run_backup.yml
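The chunking idea can be sketched with `split`; the linked workflow may do it differently, and the file name and chunk sizes below are just illustrative (a small stand-in file is created so the example runs anywhere):

```shell
# Stand-in for the large export (the real one was ~126 MB).
head -c 1200000 /dev/zero > dbname.json
# Split into fixed-size chunks that each stay under GitHub's 100 MB limit
# (500 kB here for the demo; something like `-b 50m` would suit the real file).
split -b 500k dbname.json dbname.json.part-
# Chunks are named dbname.json.part-aa, -ab, ... and can be committed normally;
# `cat dbname.json.part-* > dbname.json` reassembles the original.
ls dbname.json.part-*
```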