OneFormer
Bad predictions from HuggingFace pretrained models
Hi, thanks for sharing this great work.
I was trying to use your model following the HF example here, but it turned out to predict badly, even after trying several model versions on ADE20K training samples.
import torch
from PIL import Image
from transformers import OneFormerProcessor, OneFormerModel
image = Image.open('ADE_train_00002024.jpg')
processor = OneFormerProcessor.from_pretrained("shi-labs/oneformer_ade20k_swin_large")
model = OneFormerModel.from_pretrained("shi-labs/oneformer_ade20k_swin_large")
inputs = processor(image, ["semantic"], return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
mask_predictions = outputs.transformer_decoder_mask_predictions
pred = mask_predictions[0].argmax(0)
The results are very different from the Colab demo.
Is there a way to access the Colab pretrained weights through the Hugging Face interface?
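For comparison: the snippet above takes an argmax directly over transformer_decoder_mask_predictions, which picks a query index per pixel rather than a class label, so it will not produce a semantic map. Below is a minimal sketch of the path documented in transformers, using OneFormerForUniversalSegmentation plus the processor's post-processing; the checkpoint id and file name are the ones from the report above.

import torch
from PIL import Image
from transformers import OneFormerProcessor, OneFormerForUniversalSegmentation

image = Image.open("ADE_train_00002024.jpg")
processor = OneFormerProcessor.from_pretrained("shi-labs/oneformer_ade20k_swin_large")
# OneFormerForUniversalSegmentation wraps OneFormerModel with the class
# predictor head; the bare OneFormerModel only returns raw decoder outputs.
model = OneFormerForUniversalSegmentation.from_pretrained("shi-labs/oneformer_ade20k_swin_large")

inputs = processor(images=image, task_inputs=["semantic"], return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Combine per-query class logits with per-query mask logits and resize to the
# original resolution; returns one (height, width) tensor of ADE20K class ids.
semantic_map = processor.post_process_semantic_segmentation(
    outputs, target_sizes=[image.size[::-1]]  # PIL .size is (width, height)
)[0]

If the output still differs from the Colab demo after this, that would point at the weights rather than the decoding step.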
Hi @tldrafael, the HF weights are the same as the ones in our Colab demo. However, the result does seem strange. Could you share the image with me so I can try it myself? Thanks.
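If it helps to verify this locally, here is a rough sketch for fingerprinting the loaded checkpoint so the HF weights can be compared against whatever the Colab demo downloads; the reduction is just an ad-hoc checksum, not an official tool.

import torch
from transformers import OneFormerForUniversalSegmentation

model = OneFormerForUniversalSegmentation.from_pretrained("shi-labs/oneformer_ade20k_swin_large")
with torch.no_grad():
    # Sum of absolute parameter values: loose, but two checkpoints with
    # different weights are very unlikely to collide on this number.
    fingerprint = sum(p.abs().sum().item() for p in model.parameters())
print(f"parameters: {model.num_parameters():,}, fingerprint: {fingerprint:.2f}")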
Sure @praeclarumjj3, here's the image.
I'm also interested in this issue; has a solution been found?
Also interested