Remove checked-in tools
Trying to remove all Git LFS files to fix the following error on build (or even when just checking out UnitsNet):
Cloning into './UnitsNet2'...
remote: Enumerating objects: 71476, done.
remote: Counting objects: 100% (23610/23610), done.
remote: Compressing objects: 100% (3957/3957), done.
remote: Total 71476 (delta 21266), reused 21395 (delta 19439), pack-reused 47866
Receiving objects: 100% (71476/71476), 57.81 MiB | 6.75 MiB/s, done.
Resolving deltas: 100% (63157/63157), done.
Updating files: 100% (1968/1968), done.
Downloading Tools/7-zip/7za.exe (536 KB)
Error downloading object: Tools/7-zip/7za.exe (fa252e5): Smudge error: Error downloading Tools/7-zip/7za.exe (fa252e501332b7486a972e7e471cf6915daa681af35c6aa102213921093eb2a3): batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.
The machines GitHub uses for hosted CI runners have 7-Zip and NuGet preinstalled.
(I don't know what sfk.exe is!)
Thanks, I got the notification yesterday that the LFS bandwidth of 1 GB was used up, 1.5 had been used. Storage was fine.
I guess the repo is being checked out a lot by CI systems, but I'm astonished that I'm hitting bandwidth limits on a fairly small project with very small binary files.
I bought a LFS data pack to fix it short term, but I'm happy to remove the binaries we don't need.
I think we have some dependencies on nuget.exe at least. I don't think we can change the scripts to assume nuget.exe is in the PATH on all systems. Especially for those who check out the repo and run build.bat, it should "just work".
At my work, we use init scripts to download and install any dependencies required to build or run. Mostly Docker stuff. I'm thinking we can add an init.bat that downloads Tools\nuget.exe if it does not already exist? Same with any other dependencies we have, maybe 7zip.
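As a sketch of that idea (assuming the well-known nuget.org distribution URL; the actual tool list and URLs would need to be confirmed), an init.bat could look something like this:

```bat
@echo off
rem init.bat -- download build dependencies into Tools\ if they are missing.
rem The nuget.org "latest" URL below is an assumption and should be verified.

if not exist "%~dp0Tools" mkdir "%~dp0Tools"

if not exist "%~dp0Tools\nuget.exe" (
    echo Downloading nuget.exe...
    powershell -NoProfile -Command ^
        "Invoke-WebRequest 'https://dist.nuget.org/win-x86-commandline/latest/nuget.exe' -OutFile '%~dp0Tools\nuget.exe'"
)

rem Same pattern for any other tools (e.g. 7za.exe) once a download URL is chosen.
```

build.bat could then call init.bat at the top, so a fresh clone "just works" without anything checked into LFS.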
Slightly unrelated, but we could also switch to GitHub Actions, or my favorite, Azure DevOps, but time is what I struggle to find. AppVeyor is excruciatingly slow at times, but it has mostly worked well.
Or better yet, dive into #693 and never worry about what's in a users path again 😉
But I would definitely be in favor of moving to ADO or GitHub actions. I am familiar with ADO but not Actions.
Oh yeah! Remote environment is really compelling. Have you tried it?
Still, I do think the init script is a good idea to make it really easy to check out and build/run, as I would expect for any GitHub project. Remote container is a nice supplement, but I wouldn't want to enforce it.
I have tried it with some of my own projects and seen it used in some other open source repos. Really it's great. Especially Linux ones where you end up in some CMake dependency / toolchain craziness.
That said, it won't get around this issue. Everyone who clones UnitsNet, including CI builds, is contributing to that bandwidth limit. How do we get around this in the meantime?
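One way to stop the bleeding (a sketch; the paths are the ones from the clone error above) is to stop tracking the binaries in LFS going forward, either committing them as plain blobs or deleting them once an init-script approach is in place:

```shell
# Stop tracking the tools in LFS (updates .gitattributes)
git lfs untrack "Tools/7-zip/7za.exe" "Tools/nuget.exe"

# Re-stage them so they are committed as regular blobs (or `git rm` them outright)
git rm --cached Tools/7-zip/7za.exe Tools/nuget.exe
git add Tools/7-zip/7za.exe Tools/nuget.exe .gitattributes
git commit -m "Stop tracking build tools in Git LFS"
```

Note this only affects new commits; existing history still references the LFS objects, so old revisions keep consuming bandwidth when checked out. Rewriting history with `git lfs migrate export` would remove them completely, but it changes commit hashes and forces everyone to re-clone.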
I bought the LFS data pack, so we are good for at least 30 days, or until I unsubscribe.
Well, if you ever feel like setting up a remote setup for UnitsNet, that would be really cool!
Well, if you ever feel like setting up a remote setup for UnitsNet, that would be really cool!
I will try to find some time. Easier said than done though 😄
Codecov Report
Patch and project coverage have no change.
Comparison is base (9f6c315) 84% compared to head (f8c1d1d) 84%.
Additional details and impacted files
@@ Coverage Diff @@
## master #1052 +/- ##
======================================
Coverage 84% 84%
======================================
Files 334 334
Lines 31827 31827
======================================
Hits 26811 26811
Misses 5016 5016
☂️ View full report in Codecov by Sentry.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Pushed some changes, think this is ready to go. Waiting for PR build to see that everything works.
Queued a CI build too: https://dev.azure.com/unitsnet/Units.NET/_build/results?buildId=541&view=results