
Fine-Tune a Semantic Segmentation Model with a Custom Dataset colab working fine?

nunezmatias opened this issue 2 years ago · 0 comments

related to this colab https://colab.research.google.com/drive/1BImTyBjW3KtvHGVcjGpYYFZdRGXzM3-j?usp=sharing&hl=en&authuser=1#scrollTo=7Up9QNqOWtSD which is the same that is in the blog https://huggingface.co/blog/fine-tune-segformer

Hi. I have been trying to reproduce the example in Colab, but I have not been able to yet. If I use the option

Use a dataset from the Hub

and change to hf_dataset_identifier = "segments/sidewalk-semantic"

(the default identifier does not exist anymore) and then run all the cells as they are, I hit an error related to `np` when running the training, so I had to add `import numpy as np` before that cell. After starting the training, it has been running the optimization for hours and hours (with a GPU, high-RAM runtime).
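For reference, these are the two changes I made, as a minimal sketch (the variable name `hf_dataset_identifier` follows the notebook):

```python
import numpy as np  # missing from the notebook; the training cell errors without it
from datasets import load_dataset

# the notebook's default identifier no longer exists on the Hub
hf_dataset_identifier = "segments/sidewalk-semantic"
ds = load_dataset(hf_dataset_identifier)
```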

and it seems it would take 69 hours? (screenshot attached)
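In case it matters, here is a quick check that the GPU is actually being used (a sketch, not part of the notebook; 69 hours sounds more like CPU speed):

```python
import torch

# if this prints False, the Colab runtime is not using the GPU at all
print(torch.cuda.is_available())
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))
```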

Also there is a NaN issue; not sure if it is related. (screenshot attached)
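In case the NaN is related to the labels, one thing worth checking is the range of label ids in the annotation masks (a sketch, assuming the annotation column is called `label` as in the blog; 255 is the ignore index used by the model's loss):

```python
import numpy as np
from datasets import load_dataset

ds = load_dataset("segments/sidewalk-semantic")
# print the label ids present in the first annotation mask
print(np.unique(np.array(ds["train"][0]["label"])))
```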

Also, I could not use the other method, "Create your own dataset": I had errors when bringing back the labelled pictures from Segments.ai. I will reproduce them again and post them here.
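For reference, the part of the blog I was following is roughly this (a sketch using the blog's naming; the API key, dataset identifier, and release name are placeholders):

```python
from segments import SegmentsClient
from segments.huggingface import release2dataset

# placeholders: substitute your own Segments.ai credentials and names
segments_client = SegmentsClient("YOUR_API_KEY")
release = segments_client.get_release("your-username/your-dataset", "v0.1")
hf_dataset = release2dataset(release)
```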

nunezmatias · Sep 21 '22 17:09