api-inference-community
                        Hosted Inference API overloaded
Hi @Narsil, or anyone who can help me. I have a problem with the Inference API for my model GeoBERT (https://huggingface.co/botryan96/GeoBERT).
It returns "Overloaded" or sometimes "Internal Server Error" every time I try to query it.
I'm kind of a newbie here, so please help me out. Is the problem on my side?
Thanks
Same here, e.g. deepset/roberta-base-squad2 deployed with the Inference API.
Can you try again? Seems better now.
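For transient "Overloaded" / 503 responses, retrying the request (and asking the API to wait for the model to load) often helps. Below is a minimal sketch in Python; the access token, retry counts, and example sentence are placeholders, not part of the original thread.

```python
import time

import requests

# Hosted Inference API endpoint for the model mentioned in this thread.
API_URL = "https://api-inference.huggingface.co/models/botryan96/GeoBERT"
# Placeholder token; replace with your own Hugging Face access token.
HEADERS = {"Authorization": "Bearer hf_xxx"}


def query(text: str, retries: int = 5, backoff: float = 10.0):
    payload = {
        "inputs": text,
        # Ask the API to wait until the model is loaded instead of
        # immediately returning a 503 while it spins up.
        "options": {"wait_for_model": True},
    }
    for attempt in range(retries):
        response = requests.post(API_URL, headers=HEADERS, json=payload, timeout=120)
        if response.status_code == 200:
            return response.json()
        # 503 or similar usually means the backend is loading or overloaded;
        # wait a bit and try again.
        print(f"Attempt {attempt + 1}: HTTP {response.status_code} - {response.text}")
        time.sleep(backoff)
    raise RuntimeError("Inference API still failing after all retries")


if __name__ == "__main__":
    print(query("The sandstone layers overlie the granite basement."))
```

This only smooths over temporary load or cold-start delays; if the endpoint consistently errors out, it is likely an issue on the serving side rather than in the request.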