Liu
+1
> +1 This problem is mainly because the txt file is too large, so it takes a long time to load. It would help to add support for splitting the txt file.
Yeah, I think so too. I also encountered this problem.
I guess it's because you didn't use a proxy, or your IP is blocked. I solved this by using Clash.
Solved this question by using `AutoTokenizer.from_pretrained(model_id)`.
I got the same issue
> > llama/generation.py
> >
> > ```python
> > class Llama:
> >     @staticmethod
> >     def build(
> >         ckpt_dir: str,
> >         tokenizer_path: str,
> >         max_seq_len: int,
> >         max_batch_size: int,
> >         model_parallel_size: Optional[int] = ...
> > ```
Let me summarize it. This happens because triu_tril_cuda_template was only implemented for BFloat16 in torch 2.1.0 and later versions. Reference: https://github.com/huggingface/diffusers/issues/3453 So, basically you have...
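A minimal sketch of how one could gate on the installed torch version before relying on `torch.triu`/`torch.tril` with bfloat16 CUDA tensors. The helper names (`parse_version`, `supports_bf16_triu`) are hypothetical, and only the standard library is used so the check works even before torch is imported:

```python
# Hypothetical helpers: the BFloat16 CUDA kernels for triu/tril
# (triu_tril_cuda_template) only landed in torch 2.1.0, so on older
# versions a workaround (e.g. casting to float32 first) is needed.

def parse_version(v: str) -> tuple:
    """Turn a version string like '2.0.1+cu118' into a comparable tuple."""
    core = v.split("+")[0]  # drop local build tags such as '+cu118'
    return tuple(int(part) for part in core.split("."))

def supports_bf16_triu(torch_version: str) -> bool:
    """True if this torch version ships the BFloat16 triu/tril CUDA kernels."""
    return parse_version(torch_version) >= (2, 1, 0)

print(supports_bf16_triu("2.0.1+cu118"))  # False: needs the cast workaround
print(supports_bf16_triu("2.1.0"))        # True
```

In practice you would pass `torch.__version__` to the check, and on the `False` branch either upgrade torch or cast the tensor to `float32` before `triu` and back afterwards.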
By the way, after I downloaded the weight file from Google Drive, my path looks like this.  But in the code you are trying to access: ...
Oh, I realized where the problem is: I did extract it, but to the wrong path. I noticed nothing happens when I only run aimbotTensorflow.py, and it seems only...