llama-stack
llama stack build with docker not working: config yaml not found
Here is the full trace of logs:
```
Enter a name for your Llama Stack (e.g. my-local-stack): test
Enter the image type you want your Llama Stack to be built as (docker or conda): docker

Llama Stack is composed of several APIs working together. Let's configure the providers (implementations) you want to use for these APIs.

Enter provider for the inference API: (default=meta-reference): meta-reference
Enter provider for the safety API: (default=meta-reference): meta-reference
Enter provider for the agents API: (default=meta-reference): meta-reference
Enter provider for the memory API: (default=meta-reference): meta-reference
Enter provider for the telemetry API: (default=meta-reference): meta-reference

(Optional) Enter a short description for your Llama Stack: test

Dockerfile created successfully in /var/folders/kq/8rs_ws5s72x3dmpzh_z6_zc80000gn/T/tmp.zQDghZdAmH/Dockerfile
```

The generated Dockerfile (the `#` on the ENTRYPOINT line is restored here to match the `14 | # ENTRYPOINT ...` context shown in the build error below):

```Dockerfile
FROM python:3.10-slim
WORKDIR /app

RUN apt-get update && apt-get install -y iputils-ping net-tools iproute2 dnsutils telnet curl wget telnet procps psmisc lsof traceroute bubblewrap && rm -rf /var/lib/apt/lists/*

RUN pip install llama-stack
RUN pip install fastapi fire httpx uvicorn accelerate blobfile fairscale fbgemm-gpu==0.8.0 torch torchvision transformers zmq accelerate codeshield torch transformers matplotlib pillow pandas scikit-learn aiosqlite psycopg2-binary redis blobfile chardet pypdf tqdm numpy scikit-learn scipy nltk sentencepiece transformers faiss-cpu
RUN pip install torch --index-url https://download.pytorch.org/whl/cpu
RUN pip install sentence-transformers --no-deps

# This would be good in production but for debugging flexibility lets not add it right now
# We need a more solid production ready entrypoint.sh anyway
# ENTRYPOINT ["python", "-m", "llama_stack.distribution.server.server"]

ADD testllama/lib/python3.11/site-packages/llama_stack/configs/distributions/docker/test-build.yaml ./llamastack-build.yaml
```
The build then fails:

```
- docker build -t llamastack-test -f /var/folders/kq/8rs_ws5s72x3dmpzh_z6_zc80000gn/T/tmp.zQDghZdAmH/Dockerfile /Users/franciscojose.maldonado/_VOIS/hosted-models/testllama/lib/python3.11/site-packages
[+] Building 1.6s (12/12) FINISHED                          docker:desktop-linux
 => [internal] load build definition from Dockerfile                       0.0s
 => => transferring dockerfile: 1.16kB                                     0.0s
 => [internal] load metadata for docker.io/library/python:3.10-slim        1.5s
 => [internal] load .dockerignore                                          0.0s
 => => transferring context: 2B                                            0.0s
 => CANCELED [1/8] FROM docker.io/library/python:3.10-slim@sha256:80619a5  0.0s
 => => resolve docker.io/library/python:3.10-slim@sha256:80619a5316afae70  0.0s
 => => sha256:80619a5316afae7045a3c13371b0ee670f39bac46ea 9.13kB / 9.13kB  0.0s
 => => sha256:80cd7261f1d8c75b18c5804f8045ef9601cf87d631e 1.75kB / 1.75kB  0.0s
 => => sha256:4b31b4d67fb996eb2f30873969bfee7a7256e029338 5.24kB / 5.24kB  0.0s
 => [internal] load build context                                          0.0s
 => => transferring context: 2B                                            0.0s
 => CACHED [2/8] WORKDIR /app                                              0.0s
 => CACHED [3/8] RUN apt-get update && apt-get install -y iputils-         0.0s
 => CACHED [4/8] RUN pip install llama-stack                               0.0s
 => CACHED [5/8] RUN pip install fastapi fire httpx uvicorn accelerate bl  0.0s
 => CACHED [6/8] RUN pip install torch --index-url https://download.pytor  0.0s
 => CACHED [7/8] RUN pip install sentence-transformers --no-deps           0.0s
 => ERROR [8/8] ADD testllama/lib/python3.11/site-packages/llama_stack/co  0.0s
```
```
 > [8/8] ADD testllama/lib/python3.11/site-packages/llama_stack/configs/distributions/docker/test-build.yaml ./llamastack-build.yaml:
Dockerfile:16
  14 |     # ENTRYPOINT ["python", "-m", "llama_stack.distribution.server.server"]
  15 |
  16 | >>> ADD testllama/lib/python3.11/site-packages/llama_stack/configs/distributions/docker/test-build.yaml ./llamastack-build.yaml
  17 |
ERROR: failed to solve: failed to compute cache key: failed to calculate checksum of ref 6e3e8a01-bae9-4931-bbf3-b0482435687d::gqnn34no46dhkpcusz5dxj7w0: "/testllama/lib/python3.11/site-packages/llama_stack/configs/distributions/docker/test-build.yaml": not found
Failed to build target test with return code 1
```
And the yaml is in this path:
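One possible reading of the error, assuming standard Docker/BuildKit behavior: an `ADD` source is resolved relative to the build context (the last argument to `docker build`), not relative to the Dockerfile's location. Since the context here is already `.../testllama/lib/python3.11/site-packages` and the `ADD` source repeats that same prefix, Docker would look for a doubled path that does not exist. A minimal sketch of that resolution, using the paths from the trace above:

```shell
# Sketch: how BuildKit resolves the ADD source path (assumption based on
# standard Docker behavior; paths are taken from the trace above).
# The ADD source is joined onto the build context directory, not onto
# the Dockerfile's own directory.
context="/Users/franciscojose.maldonado/_VOIS/hosted-models/testllama/lib/python3.11/site-packages"
add_src="testllama/lib/python3.11/site-packages/llama_stack/configs/distributions/docker/test-build.yaml"

resolved="$context/$add_src"
echo "$resolved"
# Note the "testllama/lib/python3.11/site-packages" segment appears twice --
# this doubled path is what Docker reports as "not found".
```

If that is what is happening, the `ADD` source in the generated Dockerfile would need to be relative to the context (i.e. `llama_stack/configs/...`), or the build invoked with a different context directory.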