Storage for opentelemetry-ebpf-profiler coredump test suite
As part of the Elastic Universal Profiling donation to OpenTelemetry, the coredump files that make up the coredump test suite (the main testing mechanism for the various language runtime unwinders) need to be transferred to OpenTelemetry.
They currently sit in an Elastic S3 bucket that is inaccessible to the public: 618 objects, each individually compressed, with a total size of ~2.8 GB.
CC: @tigrannajaryan
We may be able to set you up with something similar in Oracle Cloud where the CNCF has a lot of credits.
cc @austinlparker as the Admin for our Oracle Cloud Account (https://github.com/open-telemetry/community/blob/main/assets.md#oracle-cloud-account)
Yeah we can do this in our OCI instance. Feel free to reach out on Slack to discuss the details.
I've created a bucket with a public (read) access URL, as well as an account and management keys for it. The credentials and information are in a new 1Password vault that the following individuals have been invited to:
[email protected] [email protected] [email protected] [email protected] [email protected]
@felixge @petethepig please reach out to @austinlparker with your emails in order to be added.
I did accept the invite and set up 2FA for my 1Password account, but there are no shared vaults. Is there something I have to do, besides accepting the invite, to get access to the credentials for the OCI instance?
@tylerbenson mentioned another option for storing the coredump test suite in slack: https://docs.github.com/en/repositories/working-with-files/managing-large-files/about-git-large-file-storage
this seems like a nice option which would also preserve history, what do you all think?
it looks like the large files could be added directly to the opentelemetry-ebpf-profiler repository, or if there are concerns we could create a separate repo for them under the open-telemetry org
> @tylerbenson mentioned another option for storing the coredump test suite in slack: https://docs.github.com/en/repositories/working-with-files/managing-large-files/about-git-large-file-storage
> this seems like a nice option which would also preserve history, what do you all think?
> it looks like the large files could be added directly to the opentelemetry-ebpf-profiler repository, or if there are concerns we could create a separate repo for them under the open-telemetry org
The proposed Oracle Cloud solution exposes an S3-compatible API which would require minimal engineering effort from us to adapt, as we're already pushing to S3.
Besides the additional engineering effort, possible issues I see with git-lfs are the storage and bandwidth limits, which are quite conservative by default.
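To illustrate how small the change would be on our side: OCI Object Storage exposes an S3-compatible endpoint, so existing S3 tooling should work with only a config change. A hedged sketch with the aws CLI (the bucket name, namespace, region, and file path below are placeholders, not the actual values):

```shell
# Upload a compressed coredump to the OCI bucket via the S3-compatible API.
# <namespace> and <region> are placeholders -- substitute the values for the
# actual OCI tenancy; credentials come from the shared 1Password vault.
aws s3 cp testdata/example.core.zst \
  s3://coredump-test-suite/example.core.zst \
  --endpoint-url "https://<namespace>.compat.objectstorage.<region>.oraclecloud.com"
```

The same `--endpoint-url` override applies to any S3 SDK, so the existing upload code path should only need the endpoint and credentials swapped.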
> it looks like the large files could be added directly to the opentelemetry-ebpf-profiler repository, or if there are concerns we could create a separate repo for them under the open-telemetry org
if you go with Git LFS approach I would recommend a separate repo included as a submodule.
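For reference, the separate-repo-plus-submodule setup would look roughly like this (the data repo name is hypothetical, as is the `*.core` pattern):

```shell
# In a new repository dedicated to the coredump test data:
git lfs install                 # one-time per machine
git lfs track "*.core"          # store matching files as LFS pointers
git add .gitattributes
git commit -m "Track coredumps with Git LFS"

# In opentelemetry-ebpf-profiler, pull the data repo in as a submodule:
git submodule add https://github.com/open-telemetry/<data-repo>.git testdata
```

This keeps the large binaries (and their LFS quota usage) out of the main repository while still versioning them alongside the code.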
> I did accept the invite and did setup 2FA for my 1Password account, but there are not shared vaults. Is there something I have to do, besides accepting the invite to get access to credentials for the OCI instance?
Did you check after I validated your account?
@florianl hey - you wouldn't have seen shared vaults until i confirmed your account, which I did yesterday afternoon. please check again.
@austinlparker I cannot find an invitation (also not in the spam folder). Could you check or send the invite again?
> [..] i confirmed your account, which I did yesterday afternoon. please check again.
Thanks @austinlparker - I can now see and access the opentelemetry-profiling vault. :+1:
@christos68k @florianl @rockdaboot is everything good here, can we close the issue? thanks!
I have verified that the provided credentials work for me using the oci CLI tooling, so closing this issue is fine with me.
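For anyone else verifying their access, a minimal check with the oci CLI might look like the following (the namespace and bucket names are placeholders; the actual values are in the shared vault):

```shell
# List objects in the bucket to confirm the credentials work.
oci os object list \
  --namespace-name "<namespace>" \
  --bucket-name "<bucket>"
```

A successful JSON listing of the 618 objects confirms both the credentials and the bucket configuration.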