OpenDelta

A plug-and-play library for parameter-efficient tuning (Delta Tuning)

29 OpenDelta issues

The official documentation says that you do not need to change the optimizer during training. I don't quite understand this: does it mean an optimizer already exists? If so, how do I get hold of it? ![opendelta documentation](https://github.com/thunlp/OpenDelta/assets/83445004/c22f9b2b-420c-42f3-be32-73424aee04bd)
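A minimal sketch of what "no change to the optimizer" usually means in delta tuning: once the delta model freezes the backbone, only the delta parameters keep `requires_grad=True`, so the ordinary optimizer construction works unmodified. The T5 backbone and AdamW settings below are illustrative assumptions, not taken from the issue.

```
import torch
from transformers import AutoModelForSeq2SeqLM
from opendelta import LoraModel

# Illustrative backbone; any Hugging Face model works the same way.
backbone = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Attach LoRA and freeze everything except the delta parameters.
delta_model = LoraModel(backbone_model=backbone)
delta_model.freeze_module(exclude=["deltas"], set_state_dict=True)

# "The optimizer" is simply whatever you already build over model.parameters();
# frozen parameters have requires_grad=False and receive no updates.
optimizer = torch.optim.AdamW(
    (p for p in backbone.parameters() if p.requires_grad), lr=1e-4
)
```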

I don't understand why the inference results of the reloaded trained model are so poor. Did I write something wrong? Looking forward to your reply. The key code is as...

After fine-tuning the 10B CPM-Bee with OpenDelta, running inference with the LoRA weights loaded (as shown below) is about 50% slower than inference with the original model. How can this be solved? Is it possible to merge the LoRA weights into the original weights?

```
tokenizer = CPMBeeTokenizer()
model = CPMBeeTorch(config=config)
delta_model = LoraModel(backbone_model=model, modified_modules=["project_q", "project_v"], backend="hf")
model.load_state_dict(torch.load(args.delta), strict=False)
model.load_state_dict(torch.load(ckpt_path), strict=False)
```
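On the merging question, here is a minimal sketch of the general LoRA fold-in, under the assumption that each adapted linear layer exposes its low-rank factors; the names `lora_A`, `lora_B`, and `scaling` are placeholders and may differ in OpenDelta's implementation or the chosen backend.

```
import torch

@torch.no_grad()
def merge_lora_into_linear(linear, lora_A, lora_B, scaling=1.0):
    # LoRA computes W x + scaling * B (A x); folding it in gives
    # W' = W + scaling * (B @ A), after which the LoRA branch can be removed.
    # lora_A: (r, in_features), lora_B: (out_features, r)
    linear.weight += scaling * (lora_B @ lora_A)
```

Once merged, the extra low-rank projection per adapted module disappears at inference time, which is likely where the reported slowdown comes from.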

I would like to request some clarification on the differences between OpenDelta and PEFT. From my understanding, both OpenDelta and PEFT can control which parameters are trained....

The DeltaCenter library cannot be found. What is going on?

Judging from the error message, it seems setup.py needs to be updated to use "scikit-learn":

```
error: subprocess-exited-with-error
× python setup.py egg_info did not run successfully.
│ exit code: 1
╰─> [18 lines of output]
    The 'sklearn' PyPI package is deprecated, use 'scikit-learn' rather...
```
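A hypothetical sketch of the fix the error message points to: declare "scikit-learn" in setup.py instead of the deprecated "sklearn" meta-package. The surrounding arguments are placeholders, not OpenDelta's actual setup.py contents.

```
from setuptools import setup, find_packages

setup(
    name="opendelta",          # placeholder metadata
    packages=find_packages(),
    install_requires=[
        "scikit-learn",        # was "sklearn", which is deprecated on PyPI
    ],
)
```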

Does the OpenDelta delta model support P-Tuning v2?

```
HTTPError                                 Traceback (most recent call last)
<ipython-input> in <module>()
     12 from opendelta import AutoDeltaModel, AutoDeltaConfig
     13 # use existing delta models from DeltaCenter
---> 14 delta = AutoDeltaModel.from_finetuned("thunlp/Spelling_Correction_T5_LRAdapter_demo", backbone_model=t5)
...
```