
Why can't I use my llama stack? Can anyone help me?

Open xie85 opened this issue 11 months ago • 0 comments

```
[2401213419@l12gpu30 llama3.2]$ llama stack run llapku --port 8080
Using config /lustre/home/2401213419/.llama/builds/conda/llapku-run.yaml
Resolved 19 providers
 inner-inference => meta-reference
 models => routing_table
 inference => autorouted
 inner-safety => meta-reference
 shields => routing_table
 safety => autorouted
 inner-memory => meta-reference
 memory_banks => routing_table
 memory => autorouted
 agents => meta-reference
 inner-datasetio => meta-reference
 datasets => routing_table
 datasetio => autorouted
 inner-scoring => meta-reference
 scoring_functions => routing_table
 scoring => autorouted
 eval => meta-reference
 telemetry => meta-reference
 inspect => builtin
```

```
Traceback (most recent call last):
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/distribution/server/server.py", line 368, in <module>
    fire.Fire(main)
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/fire/core.py", line 135, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/fire/core.py", line 468, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/fire/core.py", line 684, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/distribution/server/server.py", line 300, in main
    impls = asyncio.run(resolve_impls(config, get_provider_registry(), dist_registry))
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/distribution/resolver.py", line 191, in resolve_impls
    impl = await instantiate_provider(
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/distribution/resolver.py", line 287, in instantiate_provider
    impl = await fn(*args)
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/inference/__init__.py", line 16, in get_provider_impl
    from .inference import MetaReferenceInferenceImpl
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/inference/inference.py", line 18, in <module>
    from .generation import Llama
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/llama_stack/providers/impls/meta_reference/inference/generation.py", line 18, in <module>
    import torch
  File "/lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/torch/__init__.py", line 367, in <module>
    from torch._C import *  # noqa: F403
ImportError: /lustre/home/2401213419/software/miniconda3/envs/llamastack-llapku/lib/python3.10/site-packages/torch/lib/../../nvidia/cusparse/lib/libcusparse.so.12: undefined symbol: __nvJitLinkComplete_12_4, version libnvJitLink.so.12
Error occurred in script at line: 40
```
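Editor's note for context: an `ImportError` mentioning `__nvJitLinkComplete_12_4` typically points at a version mismatch among the NVIDIA CUDA 12 libraries in the environment — the symbol name suggests torch's bundled `libcusparse.so.12` expects nvJitLink 12.4 or newer, while the loader resolved an older `libnvJitLink.so.12`. A minimal diagnostic sketch, assuming the libraries came from PyPI's CUDA 12 wheels (the `nvidia-*-cu12` package names are assumptions based on that wheel naming, not something confirmed by this report):

```python
# Report the installed versions of the packages the traceback implicates,
# so the cuSPARSE and nvJitLink wheel versions can be compared side by side.
from importlib import metadata

# Package names assume PyPI's CUDA 12 wheel naming scheme.
packages = ("torch", "nvidia-cusparse-cu12", "nvidia-nvjitlink-cu12")
versions = {}
for pkg in packages:
    try:
        versions[pkg] = metadata.version(pkg)
    except metadata.PackageNotFoundError:
        versions[pkg] = "not installed"

for pkg, ver in versions.items():
    print(f"{pkg}: {ver}")
```

If `nvidia-nvjitlink-cu12` reports a version below 12.4 while `nvidia-cusparse-cu12` is at 12.4.x, a commonly reported remedy for exactly this symbol error is upgrading the nvJitLink wheel (`pip install -U nvidia-nvjitlink-cu12`) so that all CUDA 12 wheels agree on a minor version.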

xie85 avatar Nov 05 '24 18:11 xie85