airllm
Error on Apple Mac M3
import mlx.core as mx

generation_output = model.generate(
    mx.array(input_tokens['input_ids']),
    max_new_tokens=3,
    use_cache=True,
    return_dict_in_generate=True)
print(generation_output)
Error: [load] Input must be a file-like object opened in binary mode, or string
duplicated, see https://github.com/lyogavin/Anima/issues/116
mx.load(str(to_load_path))
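For context: mx.load only accepts a plain string path or a binary file object, while airllm apparently builds a pathlib.Path internally, which is what triggers the error above. A minimal sketch of the difference (the filename here is only an illustration, not the real airllm layout):

from pathlib import Path
import mlx.core as mx

to_load_path = Path("layer_0.safetensors")  # illustrative path only

# mx.load(to_load_path)               # raises: [load] Input must be a file-like object opened in binary mode, or string
weights = mx.load(str(to_load_path))  # converting the Path to a str avoids the error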
Help, where do I have to put the above code?

input_text = [
    # 'What is the capital of United States?',
    'I like',
]

MAX_LENGTH = 128
input_tokens = model.tokenizer(input_text,
    return_tensors="np",
    return_attention_mask=False,
    truncation=True,
    max_length=MAX_LENGTH,
    padding=False)

input_tokens

generation_output = model.generate(
    mx.array(input_tokens['input_ids']),
    max_new_tokens=3,
    use_cache=True,
    return_dict_in_generate=True)
print(generation_output)
Did this get solved?
Since the old page is down, I was able to access a cached version of it. For anyone having the same issue, the fix suggested by @Verdagon is:
In mlx_model_persister.py, change:
mx.load(to_load_path)
to:
mx.load(str(to_load_path))
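Concretely, the surrounding code ends up looking roughly like this; this is only a sketch, and the function name below is a placeholder (the exact names in mlx_model_persister.py may differ between airllm versions). The str() wrapper is the only change:

from pathlib import Path
import mlx.core as mx

def load_weights(to_load_path: Path):
    # mx.load cannot take a pathlib.Path directly; it needs a string
    # (or a binary file object), so convert the Path first.
    return mx.load(str(to_load_path))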
I modified my local airllm package and it works now, thanks! The file to edit is: /Users/{user-name}/miniconda3/envs/{env-name}/lib/python3.11/site-packages/airllm/persist/mlx_model_persister.py
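If anyone else needs to find where that file lives in their own environment, the path can be derived from the installed package instead of typing it by hand:

import airllm, os

# Print the location of the persister module inside the installed airllm package
# (same persist/mlx_model_persister.py path as quoted above).
print(os.path.join(os.path.dirname(airllm.__file__), "persist", "mlx_model_persister.py"))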