
What's the problem?

Open Burgeon0110 opened this issue 2 years ago • 9 comments

Execute:

```
python3 -m fastchat.model.apply_delta --base-model-path E:\llama-13b-hf --target-model-path D:\vicuna-13b-delta-v0 --delta-path D:\13b --low-cpu-mem
```

and it shows:

```
Traceback (most recent call last):
  File "D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py", line 161, in <module>
    apply_delta_low_cpu_mem(
  File "D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py", line 71, in apply_delta_low_cpu_mem
    base_tokenizer = AutoTokenizer.from_pretrained(base_model_path, use_fast=False)
  File "E:\Python\Python311\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 699, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LLaMATokenizer does not exist or is not currently imported.
```

Burgeon0110 avatar Apr 24 '23 08:04 Burgeon0110

Modify `llama-13b-hf/tokenizer_config.json`: change `"LLaMATokenizer"` to `"LlamaTokenizer"`.
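That one-word rename can also be scripted so it can't be mistyped. A minimal sketch (the helper name and the path handling are illustrative, not part of FastChat):

```python
import json

def fix_tokenizer_class(config_path):
    """Rewrite tokenizer_class in tokenizer_config.json to the spelling
    that current transformers versions expect ("LlamaTokenizer")."""
    with open(config_path, "r", encoding="utf-8") as f:
        config = json.load(f)
    # The original HF conversion scripts wrote "LLaMATokenizer";
    # transformers only recognizes "LlamaTokenizer".
    config["tokenizer_class"] = "LlamaTokenizer"
    with open(config_path, "w", encoding="utf-8") as f:
        json.dump(config, f, indent=2)

# Usage (adjust to your local path):
# fix_tokenizer_class(r"E:\llama-13b-hf\tokenizer_config.json")
```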

QqqingYuan avatar Apr 24 '23 08:04 QqqingYuan

I have changed the word, but the problem still exists.

```
===== RESTART: D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py ====
Traceback (most recent call last):
  File "D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py", line 161, in <module>
    apply_delta_low_cpu_mem(
  File "D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py", line 71, in apply_delta_low_cpu_mem
    base_tokenizer = AutoTokenizer.from_pretrained(base_model_path, use_fast=False)
  File "E:\Python\Python311\Lib\site-packages\transformers\models\auto\tokenization_auto.py", line 689, in from_pretrained
    raise ValueError(
ValueError: Tokenizer class LlaMATokenizer does not exist or is not currently imported.
```

Burgeon0110 avatar Apr 25 '23 00:04 Burgeon0110

Upgrade your transformers version, then verify it with `from transformers import LlamaTokenizer`.
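That check can be done without crashing your script: `LlamaTokenizer` only exists in sufficiently new transformers releases (LLaMA support landed around 4.28). A sketch of a defensive check (the helper name is illustrative):

```python
import importlib
import importlib.util

def llama_tokenizer_available():
    """Return True if the installed transformers package exposes
    LlamaTokenizer, i.e. `from transformers import LlamaTokenizer`
    would succeed; False if transformers is missing or too old."""
    if importlib.util.find_spec("transformers") is None:
        return False  # transformers is not installed at all
    transformers = importlib.import_module("transformers")
    return hasattr(transformers, "LlamaTokenizer")
```

If this returns False, `pip install -U transformers` and try again.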

haoranchen06 avatar Apr 25 '23 01:04 haoranchen06

> I have changed the word. But the problem still exists. `ValueError: Tokenizer class LlaMATokenizer does not exist or is not currently imported.`

Hi, check your spelling again: `LlaMATokenizer` is still not correct.

QqqingYuan avatar Apr 25 '23 03:04 QqqingYuan

> hi guy, check your word. LlaMATokenizer is not correct.

I have changed the word, but the problem still exists:

`ValueError: Tokenizer class LlaMATokenizer does not exist or is not currently imported.`

Could you give me any suggestions?

Running:

```python
model = TFAutoModel.from_pretrained("bert-base-uncased")
```

fails partway through the download:

```
Downloading tf_model.h5:  10%|█▏ | 52.4M/536M [00:38<06:02, 1.34MB/s]
Traceback (most recent call last):
  File "E:\Python\Python311\Lib\site-packages\urllib3\response.py", line 444, in _error_catcher
    yield
  File "E:\Python\Python311\Lib\site-packages\urllib3\response.py", line 567, in read
    data = self._fp_read(amt) if not fp_closed else b""
  File "E:\Python\Python311\Lib\site-packages\urllib3\response.py", line 533, in _fp_read
    return self._fp.read(amt) if amt is not None else self._fp.read()
  File "E:\Python\Python311\Lib\http\client.py", line 466, in read
    s = self.fp.read(amt)
  File "E:\Python\Python311\Lib\socket.py", line 706, in readinto
    return self._sock.recv_into(b)
  File "E:\Python\Python311\Lib\ssl.py", line 1278, in recv_into
    return self.read(nbytes, buffer)
  File "E:\Python\Python311\Lib\ssl.py", line 1134, in read
    return self._sslobj.read(len, buffer)
ssl.SSLError: [SSL: DECRYPTION_FAILED_OR_BAD_RECORD_MAC] decryption failed or bad record mac (_ssl.c:2576)
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "E:\Python\Python311\Lib\site-packages\requests\models.py", line 816, in generate
    yield from self.raw.stream(chunk_size, decode_content=True)
  File "E:\Python\Python311\Lib\site-packages\urllib3\response.py", line 628, in stream
    data = self.read(amt=amt, decode_content=decode_content)
  File "E:\Python\Python311\Lib\site-packages\urllib3\response.py", line 566, in read
    with self._error_catcher():
  File "E:\Python\Python311\Lib\contextlib.py", line 155, in __exit__
    self.gen.throw(typ, value, traceback)
  File "E:\Python\Python311\Lib\site-packages\urllib3\response.py", line 455, in _error_catcher
    raise SSLError(e)
urllib3.exceptions.SSLError: [SSL: DECRYPTION_FAILED_OR_BAD_RECORD_MAC] decryption failed or bad record mac (_ssl.c:2576)
```

During handling of the above exception, another exception occurred:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "E:\Python\Python311\Lib\site-packages\transformers\models\auto\auto_factory.py", line 468, in from_pretrained
    return model_class.from_pretrained(
  File "E:\Python\Python311\Lib\site-packages\transformers\modeling_tf_utils.py", line 2672, in from_pretrained
    resolved_archive_file = cached_file(pretrained_model_name_or_path, filename, **cached_file_kwargs)
  File "E:\Python\Python311\Lib\site-packages\transformers\utils\hub.py", line 409, in cached_file
    resolved_file = hf_hub_download(
  File "E:\Python\Python311\Lib\site-packages\huggingface_hub\utils\_validators.py", line 120, in _inner_fn
    return fn(*args, **kwargs)
  File "E:\Python\Python311\Lib\site-packages\huggingface_hub\file_download.py", line 1332, in hf_hub_download
    http_get(
  File "E:\Python\Python311\Lib\site-packages\huggingface_hub\file_download.py", line 538, in http_get
    for chunk in r.iter_content(chunk_size=10 * 1024 * 1024):
  File "E:\Python\Python311\Lib\site-packages\requests\models.py", line 824, in generate
    raise RequestsSSLError(e)
requests.exceptions.SSLError: [SSL: DECRYPTION_FAILED_OR_BAD_RECORD_MAC] decryption failed or bad record mac (_ssl.c:2576)
```

```
loading tf_model.h5:  10%|█████▏ | 52.4M/536M [00:53<06:02, 1.34MB/s]
```

Burgeon0110 avatar Apr 25 '23 07:04 Burgeon0110

It looks like it failed while downloading the model from huggingface_hub. You'd better download it to a local folder first and then run your program against that path. Here is the model: https://huggingface.co/bert-base-uncased @Burgeon0110
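A sketch of that advice in code: clone the model once (e.g. `git lfs install` then `git clone https://huggingface.co/bert-base-uncased`), and since `from_pretrained` accepts a local directory as well as a hub id, prefer the local copy when it exists. The helper below is illustrative, not part of transformers:

```python
import os

def resolve_model_source(local_dir, repo_id):
    """Return a local model directory if it looks complete, otherwise
    fall back to the hub repo id (which triggers a network download)."""
    # config.json is present in every Hugging Face model repo, so its
    # absence is a cheap sign the local copy is missing or incomplete.
    if os.path.isdir(local_dir) and os.path.isfile(
        os.path.join(local_dir, "config.json")
    ):
        return local_dir
    return repo_id

# Usage (illustrative):
# source = resolve_model_source(r"D:\bert-base-uncased", "bert-base-uncased")
# model = TFAutoModel.from_pretrained(source)
```

Loading from a local clone sidesteps flaky connections like the SSL error above entirely.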

QqqingYuan avatar Apr 25 '23 08:04 QqqingYuan

It seems that this process is dead. [image]

Burgeon0110 avatar Apr 25 '23 14:04 Burgeon0110

DEAD! [image]

Burgeon0110 avatar Apr 25 '23 14:04 Burgeon0110

Do you see this message?

```
  File "D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py", line 161, in <module>
    apply_delta_low_cpu_mem(
  File "D:\FastChat-main\FastChat-main\fastchat\model\apply_delta.py", line 75, in apply_delta_low_cpu_mem
    shutil.rmtree(target_model_path)
  File "E:\Python\Python311\Lib\shutil.py", line 759, in rmtree
    return _rmtree_unsafe(path, onerror)
  File "E:\Python\Python311\Lib\shutil.py", line 617, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "E:\Python\Python311\Lib\shutil.py", line 617, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "E:\Python\Python311\Lib\shutil.py", line 617, in _rmtree_unsafe
    _rmtree_unsafe(fullname, onerror)
  File "E:\Python\Python311\Lib\shutil.py", line 622, in _rmtree_unsafe
    onerror(os.unlink, fullname, sys.exc_info())
  File "E:\Python\Python311\Lib\shutil.py", line 620, in _rmtree_unsafe
    os.unlink(fullname)
PermissionError: [WinError 5] 拒绝访问。: 'D:\vicuna-13b-delta-v0\.git\objects\01\c42d4f85d0e0eace66bd0102f49d43a49f46c9'
```

(`拒绝访问` is the Chinese Windows message for "Access is denied.")
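This `PermissionError` happens because git marks the files under `.git\objects` read-only, and on Windows `shutil.rmtree` cannot delete read-only files. A common workaround (a sketch, not FastChat code) is an `onerror` hook that clears the read-only bit and retries:

```python
import os
import shutil
import stat

def _clear_readonly_and_retry(func, path, exc_info):
    """onerror hook for shutil.rmtree: make the offending path
    writable, then retry the failed operation (os.unlink/os.rmdir)."""
    os.chmod(path, stat.S_IWRITE)
    func(path)

def force_rmtree(path):
    """rmtree that also removes read-only files, e.g. git object files."""
    shutil.rmtree(path, onerror=_clear_readonly_and_retry)
```

Deleting the target directory (here `D:\vicuna-13b-delta-v0`) with such a helper, or manually in Explorer, before re-running `apply_delta` should get past this error. Note that on Python 3.12+ `onerror` is deprecated in favor of the `onexc` parameter.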

Burgeon0110 avatar Apr 25 '23 14:04 Burgeon0110

@Burgeon0110 This is an old issue; the transformers library fixed it a while ago. Mind if we close this one? Do you still need help with it?

surak avatar Oct 21 '23 16:10 surak

Feel free to reopen if the issue persists.

infwinston avatar Oct 21 '23 17:10 infwinston