fast-bert
Support for multi-label and multi-class text classification using DistilBERT
How can I use DistilBERT for multi-label classification to build a fast, deployable model?
Just set the model_type parameter to 'distilbert' and the pretrained weights name to 'distilbert-base-uncased'.
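For reference, a minimal sketch of that setup with fast-bert's BertDataBunch and BertLearner API, following the pattern from the project README. The data/label directories, CSV file names, and label columns are placeholders you would replace with your own:

```python
import logging
import torch
from fast_bert.data_cls import BertDataBunch
from fast_bert.learner_cls import BertLearner
from fast_bert.metrics import accuracy_thresh

logger = logging.getLogger()
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Both the tokenizer and the pretrained weights point at DistilBERT.
databunch = BertDataBunch(
    "./data/",                       # placeholder: dir with train/val CSVs
    "./labels/",                     # placeholder: dir with labels.csv
    tokenizer="distilbert-base-uncased",
    train_file="train.csv",
    val_file="val.csv",
    label_file="labels.csv",
    text_col="text",                 # placeholder text column
    label_col=["toxic", "obscene"],  # placeholder label columns (multi-label)
    batch_size_per_gpu=16,
    max_seq_length=256,
    multi_gpu=False,
    multi_label=True,
    model_type="distilbert",
)

learner = BertLearner.from_pretrained_model(
    databunch,
    pretrained_path="distilbert-base-uncased",
    metrics=[{"name": "accuracy_thresh", "function": accuracy_thresh}],
    device=device,
    logger=logger,
    output_dir="./output/",          # placeholder output dir
    multi_gpu=False,
    is_fp16=False,
    multi_label=True,
)

learner.fit(epochs=3, lr=3e-5, schedule_type="warmup_cosine")
```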
@kaushaltrivedi How about the multilingual model, "DistilmBERT"? (I want to run this for different languages.)
Is the following config correct: model_type='DistilmBERT' and tokenizer='distilbert-base-multilingual-cased'?
When I check databunch.tokenizer, it shows transformers.tokenization_distilbert.DistilBertTokenizer.
Thanks!
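For anyone landing here: as far as I can tell, fast-bert keys its model classes on lowercase names, so model_type should stay 'distilbert' for DistilmBERT as well; the multilingual behaviour comes from the pretrained name 'distilbert-base-multilingual-cased', which is why databunch.tokenizer resolves to DistilBertTokenizer. A sketch (paths, file names, and columns are placeholders):

```python
from fast_bert.data_cls import BertDataBunch

databunch = BertDataBunch(
    "./data/",                       # placeholder data dir
    "./labels/",                     # placeholder label dir
    tokenizer="distilbert-base-multilingual-cased",
    train_file="train.csv",
    val_file="val.csv",
    label_file="labels.csv",
    text_col="text",
    label_col="label",
    batch_size_per_gpu=16,
    max_seq_length=256,
    multi_label=False,
    model_type="distilbert",  # lowercase 'distilbert'; the weights name
                              # selects the multilingual variant
)

# Sanity check: the multilingual name still maps to DistilBertTokenizer.
print(type(databunch.tokenizer))
```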