
Support for multi-label and multi-class text classification using DistilBERT

Open emtropyml opened this issue 5 years ago • 2 comments

How can I use DistilBERT for multi-label classification to build a fast, deployable model?

emtropyml avatar Sep 09 '19 12:09 emtropyml

Just set the model_type parameter to distilbert and the pretrained weights name to distilbert-base-uncased.

kaushaltrivedi avatar Sep 09 '19 14:09 kaushaltrivedi
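For reference, a minimal sketch of how those two settings plug into fast-bert's multi-label setup. The keyword names mirror fast-bert's BertDataBunch / BertLearner.from_pretrained_model API; the file names and the exact call signatures shown in the comments are placeholders, not verified against a specific fast-bert version:

```python
# Settings from the answer above, collected as keyword arguments.
# File names below are hypothetical placeholders.
databunch_kwargs = {
    "tokenizer": "distilbert-base-uncased",  # pretrained weights / tokenizer name
    "model_type": "distilbert",              # selects the DistilBERT architecture
    "multi_label": True,                     # one sigmoid output per label
    "train_file": "train.csv",
    "val_file": "val.csv",
    "label_file": "labels.csv",
}

learner_kwargs = {
    "pretrained_path": "distilbert-base-uncased",
    "multi_label": True,                     # BCE loss instead of softmax cross-entropy
}

# These dicts would be splatted into the fast-bert calls, roughly:
#   databunch = BertDataBunch(DATA_PATH, LABEL_PATH, **databunch_kwargs)
#   learner = BertLearner.from_pretrained_model(databunch, **learner_kwargs, ...)
print(databunch_kwargs["model_type"])
```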

@kaushaltrivedi How about the multilingual model, "DistilmBERT"? (I want to run this for different languages.)

Is the following config correct? model_type='DistilmBERT' and tokenizer='distilbert-base-multilingual-cased'

When I check databunch.tokenizer, it shows transformers.tokenization_distilbert.DistilBertTokenizer.

Thanks!

mohammedayub44 avatar Apr 15 '20 10:04 mohammedayub44
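Following the earlier reply, model_type would presumably stay distilbert, since it names the architecture rather than the checkpoint, while the multilingual weights go in the tokenizer / pretrained name. A hedged sketch under that assumption (same placeholder-dict style as above; the fast-bert calls in the comments are illustrative, not verified):

```python
# Assumed multilingual configuration: the architecture key stays
# "distilbert"; "DistilmBERT" is the checkpoint family, not a model_type.
multilingual_kwargs = {
    "model_type": "distilbert",
    "tokenizer": "distilbert-base-multilingual-cased",  # DistilmBERT weights
    "multi_label": True,
}

# Passed the same way as the English setup, roughly:
#   databunch = BertDataBunch(DATA_PATH, LABEL_PATH, **multilingual_kwargs, ...)
# Seeing DistilBertTokenizer in databunch.tokenizer is consistent with this:
# the multilingual checkpoint still loads through the DistilBERT tokenizer class.
print(multilingual_kwargs["tokenizer"])
```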