daegon Yu
The code I had git cloned was not the latest version; after updating it with git pull, it worked fine when I used Docker. Thank you!
The same error occurs in the environment below (pip list, truncated):
absl-py 2.1.0
accelerate 1.1.0
addict 2.4.0
aiobotocore 2.15.2
aiofiles 23.2.1
aiohappyeyeballs 2.4.3
aiohttp 3.11.7
aioitertools 0.12.0
aiosignal 1.3.1
altair ...
I'm getting this error at the end of epoch 1. What could be the problem?
Take a look here: https://github.com/huggingface/transformers/issues/34702
@schlechter-afk I also experimented with different versions, but the error you mentioned kept appearing. I hope it gets resolved soon.
“A lower temperature allows the logits to vary in a wider range and thus has more flexibility.” Can this be interpreted as saying that a lower temperature makes the embeddings easier to learn...
All right, I understand what you said, but why does "the cosine similarity tend to concentrate as the temperature becomes lower"? Can you explain why this happens?
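To make the question concrete, here is a toy sketch (not from the original discussion) of the behavior being asked about: dividing cosine similarities by a smaller temperature stretches the logit range, so the softmax mass concentrates on the most similar pairs.

```python
# Toy illustration: softmax over cosine similarities at different temperatures.
import torch
import torch.nn.functional as F

cos_sims = torch.tensor([0.9, 0.7, 0.3, 0.1])  # made-up cosine similarities

for tau in (1.0, 0.1, 0.05):
    probs = F.softmax(cos_sims / tau, dim=0)
    print(f"tau={tau}: {[round(p, 3) for p in probs.tolist()]}")
# tau=1.0  -> fairly flat distribution over all pairs
# tau=0.05 -> almost all probability mass on the 0.9 pair, so the gradient
#             focuses on the hardest (most similar) negatives
```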
This method does not resume training from the interrupted learning-rate schedule and training-dataset position; it trains additionally from a re-initialized state, right?
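To make the question concrete: with the standard Hugging Face Trainer API, resuming and retraining behave differently (the checkpoint path below is a placeholder, and `trainer` is assumed to be an existing Trainer instance):

```python
# Resuming restores the optimizer state, LR scheduler, and dataloader position:
trainer.train(resume_from_checkpoint="outputs/checkpoint-500")  # placeholder path

# Calling train() without a checkpoint restarts the LR schedule and data order
# from scratch, even if the model weights were loaded from an earlier run:
trainer.train()
```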
Thank you for your reply.
If I load the LoRA model externally and train it with UnslothTrainer without calling get_peft_model(), I can keep training with the previously generated LoRA parameters. Thank you for your answer.
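For reference, a minimal sketch of the setup described above (paths, the dataset, and hyperparameters are placeholders; this assumes Unsloth's continued-training API, not the exact code from this thread):

```python
from unsloth import FastLanguageModel, UnslothTrainer, UnslothTrainingArguments

# Passing the saved adapter directory instead of the base-model name loads the
# base model together with the previously trained LoRA weights, so there is no
# need to call get_peft_model() again.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="outputs/my_lora_adapter",  # placeholder path to the saved adapter
    max_seq_length=2048,
    load_in_4bit=True,
)

trainer = UnslothTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,  # assumed to be prepared elsewhere
    dataset_text_field="text",
    max_seq_length=2048,
    args=UnslothTrainingArguments(
        per_device_train_batch_size=2,
        num_train_epochs=1,
        learning_rate=2e-4,
        output_dir="outputs",
    ),
)
trainer.train()  # continues from the loaded LoRA parameters
```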