Results: 5 issues by Benjamin Marie

When I run `from intel_extension_for_transformers.transformers.modeling import AutoModelForCausalLM`, it triggers this error:
> ContextualVersionConflict: (transformers 4.35.2 (/usr/local/lib/python3.10/dist-packages), Requirement.parse('transformers==4.34.1'), {'intel-extension-for-transformers'})

It can be reproduced on Google Colab (CPU runtime). I tried to...
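The conflict in the traceback can be reproduced without any of the packages involved: a minimal sketch, assuming the pin `transformers==4.34.1` is taken verbatim from the error, showing why `pkg_resources` raises `ContextualVersionConflict` when the installed version falls outside the declared specifier set.

```python
from packaging.requirements import Requirement
from packaging.version import Version

# The pin declared by intel-extension-for-transformers, per the traceback.
req = Requirement("transformers==4.34.1")

# The version found in the Colab environment, per the traceback.
installed = Version("4.35.2")

# A specifier set accepts a Version via `in`; 4.35.2 does not satisfy ==4.34.1,
# which is exactly the condition that triggers ContextualVersionConflict.
conflicts = installed not in req.specifier
print(conflicts)  # True
```

Downgrading `transformers` to the pinned 4.34.1 (or upgrading `intel-extension-for-transformers` to a release with a looser pin) makes the check pass.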

I installed PEFT from source and use the latest versions of Transformers and TRL. I passed the XLoRA model to TRL, but the training doesn't seem to work (training loss...

### Please check that this issue hasn't been reported before.
- [X] I searched previous [Bug Reports](https://github.com/axolotl-ai-cloud/axolotl/labels/bug) and didn't find any similar reports.

### Expected Behavior
axolotl should be installed. ###...

bug

I expected a training configuration with per_device_train_batch_size=1 and gradient_accumulation_steps=32 to yield the same (or similar) result as per_device_train_batch_size=32 and gradient_accumulation_steps=1, but that's not the case: the former is much worse....
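For a loss that is a plain mean over samples, the two configurations should indeed match: accumulating 32 microbatch gradients of size 1 and dividing by the number of steps reproduces the full-batch gradient exactly. A minimal NumPy sketch with a mean-squared-error gradient (all names here are illustrative, not from the issue):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(32, 4))  # 32 samples, 4 features
y = rng.normal(size=32)
w = np.zeros(4)

def grad(xb, yb, w):
    # Gradient of the mean squared error over the (micro)batch.
    err = xb @ w - yb
    return xb.T @ err / len(xb)

# One full batch of 32.
g_full = grad(x, y, w)

# 32 microbatches of 1, accumulated and averaged over the accumulation steps.
g_acc = sum(grad(x[i : i + 1], y[i : i + 1], w) for i in range(32)) / 32

print(np.allclose(g_full, g_acc))  # True
```

When the two configurations diverge in practice, a common cause is that the per-microbatch loss is not a uniform mean, e.g. token-level losses normalized by a varying token count per microbatch, so dividing by the step count no longer reproduces the full-batch average.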

❓ question
⏳ needs more info

I have the following warnings:

> WARNING 02-08 12:23:36 scheduler.py:949] Input prompt (2011 tokens) is too long and exceeds limit of 1024
> WARNING 02-08 12:23:36 scheduler.py:949] Input prompt (2011 tokens) is...
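In vLLM, the token limit in this scheduler warning typically reflects the engine's `max_model_len`. A hedged configuration sketch, assuming the 1024-token cap comes from that setting; the model name is hypothetical and raising the limit only works up to the model's actual context window:

```python
from vllm import LLM, SamplingParams

# Assumption: the "exceeds limit of 1024" warning is driven by max_model_len.
# Raising it lets prompts longer than 1024 tokens (e.g. the 2011-token one)
# be scheduled instead of rejected.
llm = LLM(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # hypothetical model for illustration
    max_model_len=4096,  # instead of a configuration that capped prompts at 1024
)

params = SamplingParams(max_tokens=256)
outputs = llm.generate(["<a 2011-token prompt>"], params)
```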

currently fixing