calpt

Results: 87 comments of calpt

Hey @tanyaroosta, if you're using our `AdapterTrainer` class, it will only ever save & load the adapter weights after training, never the (frozen) pre-trained weights. Beyond this saving/loading...
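
For reference, a minimal training sketch (assuming the adapter-transformers v3.x API; the base model, adapter name, and `train_dataset` are placeholders):

```python
from transformers import TrainingArguments
from transformers.adapters import AutoAdapterModel, AdapterTrainer

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter")
model.add_classification_head("my_adapter", num_labels=2)
model.train_adapter("my_adapter")  # freezes all pre-trained weights

trainer = AdapterTrainer(
    model=model,
    args=TrainingArguments(output_dir="./out", num_train_epochs=1),
    train_dataset=train_dataset,  # placeholder: your tokenized dataset
)
trainer.train()
# Checkpoints written by AdapterTrainer contain only the adapter
# (and head) weights, not the frozen base model.
```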

Hey @amitkumarj441, thanks for working on the integration of adapters into the Transformer model, we're very much looking forward to your contribution! To make sure the model integration is complete and working...

Hey @AmirAktify, it would be great if you could implement this; no one from our side is currently working on it. Also, thanks again for your help on the T5 implementation...

While I agree that Tensorflow support would be great to have, we currently don't have a timeline for this due to limited capacity.

Hey @tingyaohsu, thanks for reporting this. Would you mind providing a complete code snippet that reproduces the issue you mentioned? This makes it a lot easier for us to address...

@LZY-the-boys thanks for adding your findings to this. While what you describe is correct, I believe it should still be possible to use the generate() method together with Prefix Tuning...
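
To illustrate, a hedged sketch of combining the two (assuming adapter-transformers with `PrefixTuningConfig`; the adapter name and base model are placeholders):

```python
from transformers import AutoTokenizer
from transformers.adapters import AutoAdapterModel, PrefixTuningConfig

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoAdapterModel.from_pretrained("gpt2")
model.add_adapter("prefix", config=PrefixTuningConfig())
model.add_causal_lm_head("prefix")
model.set_active_adapters("prefix")

inputs = tokenizer("Adapters are", return_tensors="pt")
# generate() should route through the active prefix as well
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0]))
```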

Hey @LRY1994, this behavior is indeed introduced by a change in v3.x of the library. The recommended way of saving & loading model checkpoints using (adapter-)transformers is via the `save_pretrained()`/...
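
That pattern looks like this (a standard Transformers idiom; paths and names are placeholders):

```python
from transformers.adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("my_adapter")

# Save the full checkpoint (base weights + adapters) ...
model.save_pretrained("./checkpoint")

# ... and restore it later from the same directory.
model = AutoAdapterModel.from_pretrained("./checkpoint")
```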

This issue should be fixed with #406 and the next adapter-transformers release (v3.1.0).

Hey @Ch-rode, your code for adding and activating the adapter looks right. For saving the adapter, you should call `model.bert.save_adapter("./final_adapter", "thermo_cl")`, since your model is an instance of your custom class and...
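
To make that concrete, a minimal sketch of the wrapper setup described in the thread (assuming the custom class keeps the adapter model in a `bert` attribute; everything else is a placeholder):

```python
import torch
from transformers.adapters import BertAdapterModel

class ThermoClassifier(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.bert = BertAdapterModel.from_pretrained("bert-base-uncased")
        self.bert.add_adapter("thermo_cl")
        self.bert.train_adapter("thermo_cl")

model = ThermoClassifier()
# ... training ...
# Call save_adapter() on the inner adapter model, not on the wrapper:
model.bert.save_adapter("./final_adapter", "thermo_cl")
```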

> So, to be sure, I don't need to activate a head with the same name as the adapter, right? Because [in this tutorial](https://github.com/adapter-hub/adapter-transformers/blob/master/notebooks/01_Adapter_Training.ipynb) it says something like that. You...
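
For illustration, a short sketch of the naming convention the tutorial relies on (assuming adapter-transformers; giving the head the adapter's name lets a single activation cover both):

```python
from transformers.adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("bert-base-uncased")
model.add_adapter("thermo_cl")
model.add_classification_head("thermo_cl", num_labels=2)

# Activating the adapter by name also selects the matching
# prediction head, since they share the same name.
model.set_active_adapters("thermo_cl")
```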