Actual disk usage is bigger than reported storage usage in UI
We are running Harbor 2.3.0 on Red Hat 7.9. The Harbor UI reports around 19 TB of storage used, but at the OS level the volume shows 38 TB used.
The mount point is dedicated to Harbor, and we are not sure why there is such a large difference between the Harbor UI figure and the actual OS usage.
Disk Space

```
Filesystem                     Size  Used  Avail  Use%  Mounted on
/dev/mapper/vmharbor-vmharbor   41T   38T   867G   98%  /vmharbor
```
Harbor UI (screenshot: ~19 TB reported)
The actual disk space usage may differ from the size displayed in the Harbor UI. Some factors to consider (a quick check for the first two is sketched after this list):

- `_uploads` folder disk usage; see https://github.com/goharbor/harbor/issues/15641
- If the GC job isn't run in a timely manner, some space might not be released.
- Filesystem metadata might consume disk space.
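A minimal way to check the first two factors, assuming the default Harbor data root `/data` and a Harbor host of `harbor.example.com` (both placeholders; adjust for your installation):

```bash
# Per-repository disk usage of in-flight/abandoned upload data
# (default filesystem-backed registry layout).
cd /data/registry/docker/registry/v2/repositories
du -sh */_uploads */*/_uploads 2>/dev/null | sort -h

# Kick off a manual garbage-collection run through the Harbor v2 API
# (the admin credentials below are placeholders).
curl -u admin:PASSWORD -X POST \
  -H "Content-Type: application/json" \
  -d '{"schedule":{"type":"Manual"}}' \
  https://harbor.example.com/api/v2.0/system/gc/schedule
```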
@stonezdj
I see the `_uploads` directories under the repositories consuming space (listing below). So you are saying it is safe to delete the files under these directories?
I have one question: when we push images to Harbor, they get stored under `/data/registry/docker/registry/v2/blobs`, is that the correct understanding? And when images get deleted and GC doesn't clean things up, is that what lands in `_uploads`? (See the layout sketch after the listing below.)
```
4.0K   ./truesight-app-vulnerability-management-drm/_uploads
1016M  ./truesight-common-tso-connector/_uploads
4.9G   ./truesight-spring-auth-proxy/_uploads
4.0K   ./secops-common-base-image-v17/_uploads
9.3G   ./truesight-common-login/_uploads
955M   ./secops-common-base-node-image/_uploads
7.4G   ./secops-common-base-image/_uploads
2.3M   ./alpine_baseimage/_uploads
1.4G   ./truesight-common-itil/_uploads
18G    ./truesight-common-tenant-onboarding-utility/_uploads
61M    ./truesight-organization-service/_uploads
20G    ./truesight-policy-service/_uploads
4.0K   ./ade-python-alpine-baseimage/_uploads
17G    ./truesight-app-drw/_uploads
4.0K   ./truesight-app-vulnerability-management-portal/_uploads
3.6G   ./secops-common-base-node-nonpm-image/_uploads
101M   ./truesight-connectors-service/_uploads
4.0K   ./truesight-app-vulnerability-management-drw/_uploads
30M    ./truesight-users-service/_uploads
5.0G   ./truesight-rsso-portal/_uploads
4.0K   ./truesight-infra-ext-redis/_uploads
8.7G   ./truesight-platform-portal/_uploads
4.0K   ./truesight-common-workmanager/_uploads
9.7G   ./truesight-common-tssa-connector/_uploads
1.9M   ./truesight-configurations-service/_uploads
9.5G   ./truesight-app-patch-manager-portal/_uploads
3.5M   ./jgsqware/clairctl/_uploads
3.5M   ./postgres/_uploads
19G    ./truesight-catalog-service/_uploads
1012K  ./truesight-stack-manager/_uploads
4.0K   ./truesight-config-configurator/_uploads
4.0K   ./truesight-common-orchestration-connector/_uploads
22G    ./truesight-common-base-java-image/_uploads
7.3G   ./truesight-identitymanagement-service/_uploads
9.6G   ./truesight-common-tsna-connector/_uploads
5.5G   ./truesight-resource-service/_uploads
7.9G   ./truesight-common-scanner-connector/_uploads
1.2G   ./truesight-workmanager-service/_uploads
1.1G   ./truesight-common-exceptions-service/_uploads
30M    ./truesight-tssp-nginx/_uploads
4.0K   ./truesight-common-tagging/_uploads
3.2M   ./remediate-deploy-db/_uploads
4.0K   ./java/_uploads
1.3G   ./truesight-app-patch-manager-core/_uploads
4.0K   ./truesight-common-discovery-connector/_uploads
4.0K   ./ia-deploy-db/_uploads
4.0K   ./pm-activity-log-service/_uploads
3.1G   ./truesight-common-discovery-onprem-connector/_uploads
4.0K   ./truesight-infra-ext-consul/_uploads
12G    ./truesight-common-tenable-connector/_uploads
19G    ./truesight-activity-log-service/_uploads
```
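On the second question above: this is roughly how the bundled distribution (v2) registry lays out its storage, with paths assuming the default `/data` root. `_uploads` holds in-flight upload data that is moved into `blobs/` when a push completes, so what accumulates there normally comes from interrupted or failed pushes rather than from deleted images:

```
/data/registry/docker/registry/v2/
├── blobs/sha256/<xx>/<digest>/data       # committed blob content (shared)
└── repositories/<project>/<repo>/
    ├── _layers/sha256/<digest>/link      # repo-to-blob references
    ├── _manifests/                       # tag and revision metadata
    └── _uploads/<uuid>/                  # in-flight (or abandoned) uploads
```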
One more question: do we need to delete the files from `_uploads` by connecting to the registry container, or can we delete them directly by accessing the storage directory, i.e. `/data`?
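With the filesystem storage driver, the entries can be removed directly under `/data`, which is what the issue linked above discusses. A cautious sketch, assuming the default path and a 7-day age cutoff so that no in-flight upload is touched (both assumptions):

```bash
# Dry run: list upload entries older than 7 days under every _uploads dir.
find /data/registry/docker/registry/v2/repositories \
  -path '*/_uploads/*' -mtime +7 -print

# Re-run with -delete once the listing looks right.
find /data/registry/docker/registry/v2/repositories \
  -path '*/_uploads/*' -mtime +7 -delete
```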
Please upgrade Harbor to v2.5.x or newer; it purges the `_uploads` directory by default.
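For reference, the purge that newer Harbor enables is the upstream distribution registry's upload purging (`storage.maintenance.uploadpurging` in the registry's `config.yml`). On an older install it can also be switched on via the registry's standard `REGISTRY_`-prefixed environment-variable overrides; the age/interval values below are illustrative assumptions:

```bash
# Set on the registry container; the distribution registry maps these
# variables onto its storage.maintenance.uploadpurging config keys.
export REGISTRY_STORAGE_MAINTENANCE_UPLOADPURGING_ENABLED=true
export REGISTRY_STORAGE_MAINTENANCE_UPLOADPURGING_AGE=168h      # assumed value
export REGISTRY_STORAGE_MAINTENANCE_UPLOADPURGING_INTERVAL=24h  # assumed value
export REGISTRY_STORAGE_MAINTENANCE_UPLOADPURGING_DRYRUN=false  # true = log only
```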
@stonezdj Thank you so much for your support.
This issue is being marked stale due to a period of inactivity. If this issue is still relevant, please comment or remove the stale label. Otherwise, this issue will close in 30 days.
This issue was closed because it has been stalled for 30 days with no activity. If this issue is still relevant, please re-open a new issue.