docker-llama2-chat
OSError: You seem to have cloned a repository without having git-lfs installed
I followed the tutorial at https://soulteary.com/2023/07/21/use-docker-to-quickly-get-started-with-the-chinese-version-of-llama2-open-source-large-model.html.
Running the container with sh scripts/run-7b-cn.sh produces the following error:
Various files include modifications (c) NVIDIA CORPORATION & AFFILIATES. All rights reserved.
This container image and its contents are governed by the NVIDIA Deep Learning Container License. By pulling and using the container, you accept the terms and conditions of this license: https://developer.nvidia.com/ngc/nvidia-deep-learning-container-license
NOTE: CUDA Forward Compatibility mode ENABLED. Using CUDA 12.1 driver version 530.30.02 with kernel driver version 525.105.17. See https://docs.nvidia.com/deploy/cuda-compatibility/ for details.
Loading checkpoint shards:   0%|          | 0/3 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/transformers/modeling_utils.py", line 460, in load_state_dict
    return torch.load(checkpoint_file, map_location="cpu")
  File "/usr/local/lib/python3.10/dist-packages/torch/serialization.py", line 883, in load
    return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
  File "/usr/local/lib/python3.10/dist-packages/torch/serialization.py", line 1101, in _legacy_load
    magic_number = pickle_module.load(f, **pickle_load_args)
_pickle.UnpicklingError: invalid load key, 'v'.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/app/app.py", line 6, in git lfs install
followed by git lfs pull
in the folder you cloned.
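
For reference, the "invalid load key, 'v'" from torch.load usually means the checkpoint .bin files are not real weights but git-lfs pointer text files (small text files starting with "version https://git-lfs..."), i.e. the model repository was cloned without git-lfs. A minimal sketch of the fix the error message itself suggests, assuming the weights were cloned into a local directory such as LinkSoul/Chinese-Llama-2-7b (an example path; adjust to wherever you cloned the model):

```bash
# Install and enable git-lfs (package name may differ on your distro)
sudo apt-get install -y git-lfs
git lfs install

# Inside the cloned model directory, replace the LFS pointer files with the real weights
cd LinkSoul/Chinese-Llama-2-7b   # example path, assumed from the tutorial
git lfs pull

# Optional sanity check: a pointer file is tiny text, real shards are multi-GB binaries
# (shard file name below is an example)
head -c 200 pytorch_model-00001-of-00003.bin
```

After the real weight files are in place, re-running sh scripts/run-7b-cn.sh should get past the checkpoint-loading step.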