abhigoku10
@nicolas-chaulet has the inference pipeline been done?
@khawar512 can you please share the code to load it from timm? It would be very helpful!
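In case it helps the discussion, this is the minimal sketch I have been trying, assuming timm's `create_model` API; the model name here is just a placeholder and may not match the variant in question:

```python
import timm
import torch

# Hypothetical variant name -- replace with the actual model you need
model = timm.create_model('vit_base_patch16_224', pretrained=True)
model.eval()

# Quick sanity check with a dummy batch
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    out = model(x)
print(out.shape)  # e.g. torch.Size([1, 1000]) for an ImageNet classification head
```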
@JamesQFreeman @myt889 @Lin-Zhipeng can you share your train.py that loads from the pretrained model? It would be very helpful. Did you try loading the variants of the models?
@lucidrains @Erichen911 can you share the train.py you are using for custom data, or any reference?
@waleedka @philferriere any updates on this issue?
@jtquisenberry how can this be generated for custom architectures?
@YangLeiSX @raghada any update on this thread?
@agunapal @ganeshmani @smellslikeml I am getting the error "UnicodeDecodeError: 'ascii' codec can't decode byte 0x93 in position 1: ordinal not in range(128)" when I run "python3 inference.py" on CPU.
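For reference, this is the workaround I have been experimenting with. It is only a guess at the cause: I am assuming inference.py reads some text file (labels/config) with the default ASCII codec, and byte 0x93 is typically a Windows "smart quote", so opening with an explicit encoding seems to avoid the error. The file path below is hypothetical:

```python
# Guessing the decode error comes from a text file read with the default ASCII codec.
# Opening with an explicit encoding (and replacing undecodable bytes) works around it.
with open('labels.txt', 'r', encoding='utf-8', errors='replace') as f:  # path is a placeholder
    labels = [line.strip() for line in f]
```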
@smellslikeml I have not trained my own model; I am still running your example model, and that alone is where I am facing the issues.
@songhengyang Thanks for the response. The current network works for an image size of 64; can it be changed to another image size? How will accuracy be affected?
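For context, this is the kind of change I had in mind, assuming a torchvision-style preprocessing pipeline (the target size of 128 is just an example). My understanding is that a fully convolutional network can accept the new size directly, while a fixed fully connected head would need adaptive pooling or retraining, and accuracy will likely shift either way since the weights were trained at 64:

```python
from torchvision import transforms

new_size = 128  # hypothetical target image size, instead of 64

preprocess = transforms.Compose([
    transforms.Resize((new_size, new_size)),  # resize inputs to the new resolution
    transforms.ToTensor(),
])
```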