
Zero-shot transfer failed

Open htyeh opened this issue 2 years ago • 1 comment

Environment info

  • adapter-transformers version: 3.0.1
  • Platform: Linux-4.15.0-142-generic-x86_64-with-debian-stretch-sid
  • Python version: 3.6.13
  • PyTorch version (GPU?): 1.7.0+cu110 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: Yes
  • Using distributed or parallel set-up in script?: No

Details

Hi, I am using custom language adapters (config "pfeiffer"), trained as described here, for zero-shot cross-lingual transfer.

My model is mBERT with a task adapter and custom downstream layers. I train the task adapter and the downstream layers by specifying:

model.bert.train_adapter(["task_adapter"])
model.bert.active_adapters = Stack("src_lang_adapter", "task_adapter")

and for zero-shot transfer:

model.bert.active_adapters = Stack("tgt_lang_adapter", "task_adapter")

However, zero-shot transfer fails completely and I get an F1 score of 0. This issue does not happen if I use pre-trained language adapters from the Hub, or if I further fine-tune the task adapter and downstream layers on target-language data.
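For context, here is a minimal pure-Python sketch of what the Stack composition above does conceptually. This is illustrative only, not the adapter-transformers implementation; all names and scale values are made up. The point is that the hidden state flows through the language adapter first and the task adapter second, so swapping the source-language adapter for the target-language adapter at inference time leaves the trained task adapter untouched.

```python
def bottleneck_adapter(scale):
    """Toy stand-in for a trained bottleneck adapter: a residual transform."""
    def forward(hidden):
        # Residual connection, as in Pfeiffer-style adapters (illustrative only).
        return hidden + scale * hidden
    return forward

# Hypothetical adapters; the scales just make the compositions distinguishable.
task_adapter = bottleneck_adapter(0.5)
src_lang_adapter = bottleneck_adapter(0.1)
tgt_lang_adapter = bottleneck_adapter(0.2)

def stack(*adapters):
    """Compose adapters left-to-right, mirroring Stack(lang, task)."""
    def forward(hidden):
        for adapter in adapters:
            hidden = adapter(hidden)
        return hidden
    return forward

# Training-time composition: Stack("src_lang_adapter", "task_adapter").
train_forward = stack(src_lang_adapter, task_adapter)

# Zero-shot composition: only the language adapter is swapped; the task
# adapter (and its learned weights) stays exactly the same.
zero_shot_forward = stack(tgt_lang_adapter, task_adapter)
```

If the swapped-in target-language adapter was trained with a different setup than the source one (e.g. different placement or reduction factor), the task adapter sees activations unlike anything it was trained on, which is one common reason zero-shot transfer collapses with custom language adapters.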

— htyeh, Jun 17 '22 05:06

This issue has been automatically marked as stale because it has been without activity for 90 days. This issue will be closed in 14 days unless you comment or remove the stale label.

— github-actions[bot], Oct 07 '22 08:10

This issue was closed because it remained stale for 14 days without any activity.

— adapter-hub-bert, Oct 22 '22 06:10