🚀 Feature: alpine support
Which component is this feature for?
Traceloop SDK
🔖 Feature description
Let's add CI dimensions for popular operating systems as well as architectures. Specifically, we should test alpine+aarch64, even if only via docker.
🎤 Why is this feature needed ?
I noticed folks in otel are pointed at openllmetry, and started to dig. I tried the traceloop sdk, which seems to be a sort of auto-instrumentation. Perhaps for this reason, it ends up depending on everything, and that can cause platform conflicts. Notably, I hit a conflict building on alpine+aarch64 that isn't an issue on ubuntu.
✌️ How do you aim to achieve this?
GitHub Actions can run a matrix that includes docker as necessary to cover alpine+aarch64. It doesn't need to be a full build; it could just be a sample project, as sketched below.
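For illustration, a minimal sketch of what matrix jobs could run, assuming Docker Buildx with QEMU emulation is available on the runner and using the sample-project Dockerfile shared further down in this issue:
# Each matrix entry builds the sample project for one platform; no arm runner required.
docker buildx build --platform linux/arm64 .   # alpine+aarch64 via the default base_image, the combination that breaks
docker buildx build --platform linux/amd64 .   # same image on x86_64 for comparison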
🔄️ Additional Information
No response
👀 Have you spent some time to check if this feature request has been raised before?
- [X] I checked and didn't find similar issue
Are you willing to submit PR?
None
In case this helps, here is a way to make it break:
ARG base_image=docker.io/python:3.12.4-alpine3.20
# Take the Pipfile stuff, used locally and export it to requirements.
# This makes it easier to diagnose pip concerns that only pop up in Docker.
FROM $base_image as requirements
RUN python -m pip install --upgrade pip
RUN pip install -U pipenv
COPY Pipfile /
COPY Pipfile.lock /
RUN pipenv requirements > /requirements.txt
# Now, make the runtime image which uses normal python
FROM $base_image
RUN python -m pip install --upgrade pip
COPY --from=requirements /requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt
Pipfile
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
ollama = "*"
opentelemetry-instrumentation-ollama = "*"
traceloop-sdk = "*"
[dev-packages]
[requires]
python_version = "3.12"
pip error
=> ERROR [openllmetry-python-traceloop stage-1 4/4] RUN pip install -r /tmp/requirements.txt 2.5s
------
> [openllmetry-python-traceloop stage-1 4/4] RUN pip install -r /tmp/requirements.txt:
0.263 Collecting annotated-types==0.7.0 (from -r /tmp/requirements.txt (line 2))
0.302 Downloading annotated_types-0.7.0-py3-none-any.whl.metadata (15 kB)
0.330 Collecting anthropic==0.30.1 (from -r /tmp/requirements.txt (line 3))
0.341 Downloading anthropic-0.30.1-py3-none-any.whl.metadata (18 kB)
0.364 Collecting anyio==4.4.0 (from -r /tmp/requirements.txt (line 4))
0.376 Downloading anyio-4.4.0-py3-none-any.whl.metadata (4.6 kB)
0.398 Collecting backoff==2.2.1 (from -r /tmp/requirements.txt (line 5))
0.411 Downloading backoff-2.2.1-py3-none-any.whl.metadata (14 kB)
0.438 Collecting certifi==2024.7.4 (from -r /tmp/requirements.txt (line 6))
0.448 Downloading certifi-2024.7.4-py3-none-any.whl.metadata (2.2 kB)
0.506 Collecting charset-normalizer==3.3.2 (from -r /tmp/requirements.txt (line 7))
0.519 Downloading charset_normalizer-3.3.2-cp312-cp312-musllinux_1_1_aarch64.whl.metadata (33 kB)
0.542 Collecting colorama==0.4.6 (from -r /tmp/requirements.txt (line 8))
0.551 Downloading colorama-0.4.6-py2.py3-none-any.whl.metadata (17 kB)
0.570 Collecting deprecated==1.2.14 (from -r /tmp/requirements.txt (line 9))
0.580 Downloading Deprecated-1.2.14-py2.py3-none-any.whl.metadata (5.4 kB)
0.597 Collecting distro==1.9.0 (from -r /tmp/requirements.txt (line 10))
0.605 Downloading distro-1.9.0-py3-none-any.whl.metadata (6.8 kB)
0.628 Collecting filelock==3.15.4 (from -r /tmp/requirements.txt (line 11))
0.639 Downloading filelock-3.15.4-py3-none-any.whl.metadata (2.9 kB)
0.664 Collecting fsspec==2024.6.1 (from -r /tmp/requirements.txt (line 12))
0.676 Downloading fsspec-2024.6.1-py3-none-any.whl.metadata (11 kB)
0.707 Collecting googleapis-common-protos==1.63.2 (from -r /tmp/requirements.txt (line 13))
0.716 Downloading googleapis_common_protos-1.63.2-py2.py3-none-any.whl.metadata (1.5 kB)
0.919 Collecting grpcio==1.64.1 (from -r /tmp/requirements.txt (line 14))
0.929 Downloading grpcio-1.64.1.tar.gz (12.2 MB)
1.393 ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 12.2/12.2 MB 29.8 MB/s eta 0:00:00
2.147 Preparing metadata (setup.py): started
2.358 Preparing metadata (setup.py): finished with status 'error'
2.360 error: subprocess-exited-with-error
2.360
2.360 × python setup.py egg_info did not run successfully.
2.360 │ exit code: 1
2.360 ╰─> [14 lines of output]
2.360 Traceback (most recent call last):
2.360 File "<string>", line 2, in <module>
2.360 File "<pip-setuptools-caller>", line 34, in <module>
2.360 File "/tmp/pip-install-pye1fxeq/grpcio_53c876975fb941ffbb6de71925636dfa/setup.py", line 271, in <module>
2.360 if check_linker_need_libatomic():
2.360 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
2.360 File "/tmp/pip-install-pye1fxeq/grpcio_53c876975fb941ffbb6de71925636dfa/setup.py", line 215, in check_linker_need_libatomic
2.360 cpp_test = subprocess.Popen(
2.360 ^^^^^^^^^^^^^^^^^
2.360 File "/usr/local/lib/python3.12/subprocess.py", line 1026, in __init__
2.360 self._execute_child(args, executable, preexec_fn, close_fds,
2.360 File "/usr/local/lib/python3.12/subprocess.py", line 1955, in _execute_child
2.360 raise child_exception_type(errno_num, err_msg, err_filename)
2.360 FileNotFoundError: [Errno 2] No such file or directory: 'c++'
2.360 [end of output]
2.360
2.360 note: This error originates from a subprocess, and is likely not a problem with pip.
2.361 error: metadata-generation-failed
2.361
2.361 × Encountered error while generating package metadata.
2.361 ╰─> See above for output.
2.361
2.361 note: This is an issue with the package mentioned above, not pip.
2.361 hint: See above for details.
------
failed to solve: process "/bin/sh -c pip install -r /tmp/requirements.txt" did not complete successfully: exit code: 1
Thanks for this @codefromthecrypt! I'll play with it later today. At first glance, it looks like the dependency problem is coming from grpc, which is actually an otel dependency. The official otel docs suggest installing some dependencies on Alpine to make this work. I wonder if the error persists with those dependencies installed.
Indeed, grpc requires some additional platform steps, possibly limited to this:
# Install build dependencies for grpcio
RUN apk add --no-cache g++ make linux-headers
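As an untested sketch, slotted into the runtime stage of the Dockerfile above, before the pip install that compiles grpcio from source:
# Runtime stage, now with the C++ toolchain grpcio's source build expects on musl/aarch64
FROM $base_image
RUN apk add --no-cache g++ make linux-headers
RUN python -m pip install --upgrade pip
COPY --from=requirements /requirements.txt /tmp/requirements.txt
RUN pip install -r /tmp/requirements.txt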
I think the way out may be a disclaimer that using the auto-instrumentation approach means the target needs every possible dependency installed, not just the requirements of what you are actually using. That means a longer pip install and some system prerequisites, versus a specific instrumentation.
I'm not sure of the best way to say this, but I think that's the underlying issue, and those having platform troubles may be better off cherry-picking what they need.
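For the Ollama sample above, cherry-picking might look roughly like this (a sketch; the sdk and exporter packages listed are my assumption of a minimal set, adjust to what you actually use):
# Install only the instrumentation in use plus an exporter, instead of the full traceloop-sdk
pip install opentelemetry-sdk opentelemetry-exporter-otlp-proto-http opentelemetry-instrumentation-ollama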
Do you think it's an issue on our side that we need to document specifically? Or is it just something folks that use otel need to know anyway?
I would investigate the root of that grpc dep. If it's only there for span export, it might be better to switch to http, which as I understand it is the de facto transport today, after verifying with others.
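If it does turn out to be export-only, a sketch of forcing OTLP over http/protobuf via the standard OTel environment variables (whether traceloop-sdk honors these is something to verify):
# Standard OpenTelemetry exporter configuration; assumes the SDK in use reads these variables
export OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318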