P-tuning-v2
AttributeError: No huggingface_hub attribute hf_api
(gh_P-tuning-v2) ub2004@ub2004-B85M-A0:~/llm_dev/P-tuning-v2$ bash run_script/run_rte_roberta.sh
Traceback (most recent call last):
  File "run.py", line 7, in <module>
I also encountered the same problem. How can I solve it?
I looked into it some more afterwards: it was a module version problem. Updating requirements.txt to pin datasets==2.3.2 solved it.
Attempting uninstall: datasets
  Found existing installation: datasets 2.11.0
  Uninstalling datasets-2.11.0:
    Successfully uninstalled datasets-2.11.0
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
unstructured 0.5.13 requires argilla, which is not installed.
requests-oauthlib 1.3.1 requires oauthlib>=3.0.0, which is not installed.
gradio 3.23.0 requires httpx, which is not installed.
gradio-client 0.1.3 requires httpx, which is not installed.
altair 4.2.2 requires entrypoints, which is not installed.
matplotlib 3.7.1 requires numpy>=1.20, but you have numpy 1.19.2 which is incompatible.
icetk 0.0.7 requires protobuf<3.19, but you have protobuf 3.20.0 which is incompatible.
fairscale 0.4.13 requires numpy>=1.22.0, but you have numpy 1.19.2 which is incompatible.
Successfully installed datasets-2.3.2 dill-0.3.5.1 idna-3.4 multiprocess-0.70.13 numpy-1.24.3 pandas-1.4.4 requests-2.29.0 sacremoses-0.0.53 scipy-1.9.3 seqeval-1.2.2 tokenizers-0.10.3 tqdm-4.62.3 transformers-4.11.3
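Concretely, the fix is a one-line pin in requirements.txt (the version number comes from the comment above; everything else is left to pip's resolver):

```
# requirements.txt (relevant line only)
datasets==2.3.2
```

followed by pip install -r requirements.txt. As the log above shows, this downgrades several transitive dependencies as well. The original AttributeError then disappears, though the next run can still fail on the dataset download, as in the log below.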
[INFO|configuration_utils.py:620] 2023-04-27 19:56:28,521 >> Model config RobertaConfig {
  "architectures": [
    "RobertaForMaskedLM"
  ],
  "attention_probs_dropout_prob": 0.1,
  "bos_token_id": 0,
  "classifier_dropout": null,
  "eos_token_id": 2,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 1024,
  "initializer_range": 0.02,
  "intermediate_size": 4096,
  "layer_norm_eps": 1e-05,
  "max_position_embeddings": 514,
  "model_type": "roberta",
  "num_attention_heads": 16,
  "num_hidden_layers": 24,
  "pad_token_id": 1,
  "position_embedding_type": "absolute",
  "transformers_version": "4.11.3",
  "type_vocab_size": 1,
  "use_cache": true,
  "vocab_size": 50265
}
04/27/2023 19:56:29 - INFO - datasets.utils.file_utils - HEAD request to https://raw.githubusercontent.com/huggingface/datasets/2.3.2/datasets/super_glue/super_glue.py timed out, retrying... [0.3333333333333333]
04/27/2023 19:56:30 - INFO - datasets.utils.file_utils - HEAD request to https://raw.githubusercontent.com/huggingface/datasets/2.3.2/datasets/super_glue/super_glue.py timed out, retrying... [0.6666666666666666]
04/27/2023 19:56:31 - INFO - datasets.utils.file_utils - HEAD request to https://raw.githubusercontent.com/huggingface/datasets/2.3.2/datasets/super_glue/super_glue.py timed out, retrying... [1.0]
Traceback (most recent call last):
  File "run.py", line 121, in <module>