Oliver Bob Lagumen
Do we have the docs already? I would like to expose Apache, Caddy, or nginx, but don't know how. The introductory docs assume frp knowledge/experience. I came here to use...
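For context, exposing a local web server through frp usually comes down to a short client config. A minimal sketch, assuming an frps server is already running on a public host (the address, domain, and section names below are placeholders, not from the docs):

```ini
; frpc.ini — minimal sketch for exposing a local web server
[common]
server_addr = 203.0.113.10   ; placeholder public frps server
server_port = 7000

[web]
type = http
local_port = 80              ; where Apache/Caddy/nginx listens locally
custom_domains = example.com ; placeholder domain pointed at the frps host
```

Whether this matches the current frp release should be checked against its own docs, since newer versions have moved to a TOML config format.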
Thanks for this discussion. I have recently released [OpenBible](https://huggingface.co/models/oliverbob/openbible), finetuned with Tinyllama. Like what Gab has said, the results from the unsloth base model (tinyllama-bnb-4bit) are not good, or don't make any...
I have a question about merging: if I have previously saved/pushed with `model.push_to_hub("user/lora_model")`, then I finetune again and do

```python
if True:
    model.save_pretrained_merged("saved_model", tokenizer, save_method = "lora",)
if...
```
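For reference, a sketch of the full round trip behind this question, assuming the Unsloth API (`FastLanguageModel.from_pretrained`, `save_pretrained_merged`) and the placeholder repo name `user/lora_model` from the post; this needs a GPU and the `unsloth` package, so it is illustrative only:

```python
from unsloth import FastLanguageModel

# Reload the adapter that was pushed earlier
# ("user/lora_model" is the placeholder name from the post)
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "user/lora_model",
    max_seq_length = 2048,
    load_in_4bit = True,
)

# ... run another round of finetuning here ...

# save_method="lora" saves only the adapter weights;
# save_method="merged_16bit" merges the adapter into the base model first
model.save_pretrained_merged("saved_model", tokenizer, save_method = "lora")
```

Saving again with the same path or repo name overwrites the previous adapter files, so the distinction between the two `save_method` values is what decides whether you get a standalone adapter or a merged model.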
> You can reuse the same code as is. Just upload your own json file containing the documents you want to train on. Make sure that in the tinystories.py you...
> Thanks. Actually, I already figured out another way of doing it, and it's a lot of fun actually. It doesn't seem to necessarily need to be JSON, by the way....
Tried your code on Colab. It took an hour to train.

```
TrainOutput(global_step=3888, training_loss=1.6469270712063637,
            metrics={'train_runtime': 1870.5777,
                     'train_samples_per_second': 16.627,
                     'train_steps_per_second': 2.079,
                     'total_flos': 8380872833605632.0,
                     'train_loss': 1.6469270712063637,
                     'epoch': 1.0})
```

I don't know if this...
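A quick sanity check on those metrics: the throughput numbers are internally consistent, and dividing samples/sec by steps/sec recovers the effective batch size per optimizer step (all values copied from the log above):

```python
# Sanity-check the TrainOutput numbers above
train_runtime = 1870.5777       # seconds
global_step = 3888
samples_per_second = 16.627

steps_per_second = global_step / train_runtime
effective_batch = samples_per_second / steps_per_second

print(f"{steps_per_second:.3f}")   # matches the reported 2.079
print(round(effective_batch))      # effective batch size per step: 8
```

So this run did one epoch at an effective batch size of 8, which is worth knowing when comparing loss curves against runs with different batch settings.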
Is this already a part of the latest oi? I didn't see it in my last build. How do I enable it?