[SDK] Use HuggingFace Data Collator for more Transformers in LLM Trainer

Open andreyvelich opened this issue 1 year ago • 6 comments

More context: https://github.com/kubeflow/training-operator/pull/2031#discussion_r1526533371. Currently, we apply HuggingFace Data Collator only for AutoModelForCausalLM Transformer in HF LLM Trainer.

We need to investigate whether we should apply it to the other Transformers used for language modeling.

andreyvelich avatar Mar 15 '24 20:03 andreyvelich

I am interested in contributing to this. I will ping this thread if any help is required.
/assign

live2awesome avatar Mar 20 '24 08:03 live2awesome

Which types of transformers are we targeting? From what I have looked at, a Data Collator can be used with the following model classes:

  1. Masked Language Modeling - DataCollatorForLanguageModeling with mlm=True
  2. AutoModelForSeq2SeqLM - DataCollatorForSeq2Seq
  3. AutoModelForTokenClassification - DataCollatorForTokenClassification
  4. AutoModelForSequenceClassification - simple padding is sufficient

There are also options for permutation language modeling and whole word masking. Kindly suggest @andreyvelich @johnugeorge
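The mapping above could be sketched as a small dispatch table. This is only an illustrative sketch, not code from the training-operator: the `pick_collator` helper and the dictionary are hypothetical, though the model and collator class names are real `transformers` identifiers (represented here as strings so the sketch stays self-contained).

```python
# Hypothetical sketch: choosing a HuggingFace data collator by model class.
# Model/collator names mirror real transformers classes; the dispatch helper
# itself is illustrative and not part of the training-operator codebase.

COLLATOR_BY_MODEL_CLASS = {
    # Causal LM (current HF LLM Trainer behavior): mlm=False
    "AutoModelForCausalLM": ("DataCollatorForLanguageModeling", {"mlm": False}),
    # Masked LM: same collator, but mlm=True so tokens are randomly masked
    "AutoModelForMaskedLM": ("DataCollatorForLanguageModeling", {"mlm": True}),
    # Seq2seq: pads labels and prepares decoder inputs
    "AutoModelForSeq2SeqLM": ("DataCollatorForSeq2Seq", {}),
    # Token classification: pads label sequences to the padded input length
    "AutoModelForTokenClassification": ("DataCollatorForTokenClassification", {}),
    # Sequence classification: plain dynamic padding is sufficient
    "AutoModelForSequenceClassification": ("DataCollatorWithPadding", {}),
}

def pick_collator(model_class: str):
    """Return (collator_class_name, collator_kwargs) for a transformer class."""
    try:
        return COLLATOR_BY_MODEL_CLASS[model_class]
    except KeyError:
        raise ValueError(f"No data collator mapping for {model_class!r}")
```

In an actual implementation, the trainer would resolve the string names to the corresponding `transformers` classes and instantiate the collator with the tokenizer plus the listed kwargs.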

live2awesome avatar Mar 20 '24 10:03 live2awesome

Thank you for your interest @live2awesome! It would be nice if you could let us know what changes we need to make to our HF LLM Trainer to support Data Collators for other Transformers. Also, we should discuss if we should add Data Collator by default to all supported transformers.

andreyvelich avatar Mar 28 '24 23:03 andreyvelich

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

github-actions[bot] avatar Jun 27 '24 00:06 github-actions[bot]

/remove-lifecycle stale

andreyvelich avatar Jun 27 '24 11:06 andreyvelich

/lifecycle frozen

andreyvelich avatar Jun 27 '24 11:06 andreyvelich