
BEiT-3 classification


Thanks for your inspiring and effective work. You achieved really strong performance on ImageNet-1k classification. As the paper mentions, you treat classification as retrieval and perform intermediate fine-tuning on ImageNet-21k before fine-tuning on ImageNet-1k. But we are wondering how you handle polysemy, and identical class names shared by different classes, in ImageNet-21k.
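
For concreteness, a minimal sketch of classification-as-retrieval, assuming hypothetical encode_image and encode_text functions that return L2-normalized embeddings (this is not the BEiT-3 implementation):

import numpy as np

# Sketch only: encode_image / encode_text are hypothetical stand-ins for the
# model's image and text encoders, assumed to return L2-normalized vectors.
def classify_by_retrieval(image, class_names, encode_image, encode_text):
    # Embed every class name, then retrieve the best-matching one for the image.
    text_embeds = np.stack([encode_text(name) for name in class_names])  # (C, D)
    image_embed = encode_image(image)                                    # (D,)
    scores = text_embeds @ image_embed  # cosine similarities, since vectors are normalized
    return class_names[int(np.argmax(scores))]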

Looking forward to your reply.

amandaluof · Jan 03 '23 11:01

Hi @amandaluof, we use the following code to preprocess the class names:

from nltk.corpus import wordnet as wn
syn = wn.synset_from_pos_and_offset('n', wordnet_id)
class_name = ", ".join(_.name() for _ in syn.lemmas())

addf400 · Jan 04 '23 05:01

The code and pre-trained models of BEiT-3 can be found at aka.ms/beit3.

donglixp · Mar 13 '23 13:03