Any Plans To Support Solar Pro?

Open • DaddyCodesAlot opened this issue 1 year ago • 6 comments

https://huggingface.co/upstage/solar-pro-preview-instruct

Upstage released a new 22B Solar model, and this thing is crazy powerful. I was just wondering if there are any plans to support it in unsloth. Attempting to run it with unsloth gives this error:

RuntimeError                              Traceback (most recent call last)
Cell In[3], line 19
      7 # 4bit pre quantized models we support for 4x faster downloading + no OOMs.
      8 fourbit_models = [
      9     "unsloth/mistral-7b-bnb-4bit",
     10     "unsloth/mistral-7b-instruct-v0.2-bnb-4bit",
    (...)
     16     "unsloth/llama-3-8b-bnb-4bit", # [NEW] 15 Trillion token Llama-3
     17 ] # More models at https://huggingface.co/unsloth
---> 19 model, tokenizer = FastLanguageModel.from_pretrained(
            model_name = base_model,
            max_seq_length = max_seq_length,
            dtype = dtype,
            load_in_4bit = load_in_4bit,
            token = "hf_", # use one if using gated models like meta-llama/Llama-2-7b-hf
        )

File /usr/local/lib/python3.10/dist-packages/unsloth/models/loader.py:211, in FastLanguageModel.from_pretrained(model_name, max_seq_length, dtype, load_in_4bit, token, device_map, rope_scaling, fix_tokenizer, trust_remote_code, use_gradient_checkpointing, resize_model_vocab, revision, *args, **kwargs)
    204 if "rope_scaling" in error.lower() and not SUPPORTS_LLAMA31:
    205     raise ImportError(
    206         f"Unsloth: Your transformers version of {transformers_version} does not support new RoPE scaling methods.\n"
    207         f"This includes Llama 3.1. The minimum required version is 4.43.2\n"
    208         f'Try pip install --upgrade "transformers>=4.43.2"\n'
    209         f"to obtain the latest transformers build, then restart this session."
    210     )
--> 211 raise RuntimeError(autoconfig_error or peft_error)
    212 pass
    214 # Get base model for PEFT:

RuntimeError: Loading upstage/solar-pro-preview-instruct requires you to execute the configuration file in that repo on your local machine. Make sure you have read the code there to avoid malicious use, then set the option trust_remote_code=True to remove this error.
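For reference, the signature shown in the traceback confirms that FastLanguageModel.from_pretrained accepts a trust_remote_code argument, which is what the error is asking for. Below is a minimal sketch of passing it through, assuming you have read the custom modeling code in the upstage repo first (and noting from the replies below that the new Solar Pro architecture may not be supported yet):

from unsloth import FastLanguageModel

# Sketch only: trust_remote_code=True lets transformers execute the custom
# modeling code shipped inside upstage/solar-pro-preview-instruct.
# Review that code first; this alone does not guarantee unsloth supports
# the new Solar Pro architecture.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "upstage/solar-pro-preview-instruct",
    max_seq_length = 4096,        # assumed value for illustration
    dtype = None,                 # let unsloth choose (bf16 or fp16)
    load_in_4bit = True,
    trust_remote_code = True,     # acknowledges the RuntimeError above
    token = "hf_...",             # only needed for gated models
)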

DaddyCodesAlot avatar Sep 15 '24 17:09 DaddyCodesAlot

@DaddyCodesAlot please remove your Hugging Face token from the comment! We already support Solar, pretty sure. Not sure how different the new one is, though.
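(Side note for readers: a minimal sketch of loading the earlier SOLAR release through the same API used in the traceback above; the repo id upstage/SOLAR-10.7B-Instruct-v1.0 and the settings here are illustrative assumptions, not confirmed in this thread.)

from unsloth import FastLanguageModel

# Hypothetical example: the older 10.7B SOLAR instruct model loads with the
# plain FastLanguageModel API, no trust_remote_code required.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "upstage/SOLAR-10.7B-Instruct-v1.0",  # assumed repo id
    max_seq_length = 4096,   # assumed
    dtype = None,
    load_in_4bit = True,
)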

shimmyshimmer avatar Sep 15 '24 18:09 shimmyshimmer

Yep, I auto-removed it!

danielhanchen avatar Sep 15 '24 18:09 danielhanchen

Oh wait this is a new arch - I'll check and get back to you

danielhanchen avatar Sep 15 '24 18:09 danielhanchen

Ah sorry! My mind is a bit too sleepy :(

DaddyCodesAlot avatar Sep 15 '24 18:09 DaddyCodesAlot

No worries!!

danielhanchen avatar Sep 15 '24 18:09 danielhanchen

You can still see the token; just renew it.

bet0x avatar Sep 19 '24 18:09 bet0x