
FAQ Section

Open psinger opened this issue 2 years ago • 1 comment

It would be great to have some FAQs and templates/notebooks covering common questions.

  • [ ] How to generate outputs outside of LLM Studio with trained weights pushed to the Hugging Face Hub (see the example below)
  • [ ] How to continue training from previous experiments and how to load local weights (see the sketch after this list)
  • [ ] (Lack of) backward compatibility
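
A minimal sketch for the local-weights question, assuming the experiment has been exported or downloaded in Hugging Face format to a local directory (the path below is hypothetical); transformers loads a local folder the same way it loads a Hub repository:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical local path to an experiment exported in HF format
local_path = "/path/to/exported_experiment"

tokenizer = AutoTokenizer.from_pretrained(local_path)
model = AutoModelForCausalLM.from_pretrained(local_path)

# The loaded model can then be used for inference, or passed to any
# HF-compatible training loop to continue fine-tuning.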

psinger · Apr 20 '23 13:04

For loading a trained model from the Hugging Face Hub and generating outputs:

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("hf_path")
model = AutoModelForCausalLM.from_pretrained("hf_path")
model.half().cuda()

# The input prompt must match the prompt format used in LLM Studio
inputs = tokenizer("How are you?<|endoftext|>", return_tensors="pt").to("cuda")
tokens = model.generate(
  **inputs,
  max_new_tokens=64,
  temperature=0.7,
  do_sample=True,
)
print(tokenizer.decode(tokens[0], skip_special_tokens=True))
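
A small follow-up, since model.generate returns the prompt tokens together with the newly generated ones; decoding only the new part (continuing from the variables above) can look like this:

# Decode only the tokens generated after the prompt
prompt_length = inputs["input_ids"].shape[1]
answer = tokenizer.decode(tokens[0, prompt_length:], skip_special_tokens=True)
print(answer)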

psinger · Apr 20 '23 13:04

Most of this has already been resolved.

psinger · Apr 28 '23 13:04