gpt4free
llama (vercel) is broken
If you try to use any of the llama models with the Vercel provider it returns:

```
    raise ValueError(f"Model are not supported: {model}")
ValueError: Model are not supported: replicate:a16z-infra/llama13b-v2-chat
```
Weirdly, the same model name works fine on Vercel itself, so idk what's happening. Any fixes? (btw the Vercel provider otherwise seems fine)
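For anyone debugging this: the traceback suggests the provider validates the model name against a lookup table before making the request, and `replicate:a16z-infra/llama13b-v2-chat` simply isn't in that table. Here's a minimal sketch of that pattern (an assumption about the shape of the check, not the actual gpt4free source — the dict contents are hypothetical):

```python
# Hypothetical model table; the real provider's map would list the model
# strings it actually supports. The llama model from the traceback is
# missing here, which is exactly what triggers the reported ValueError.
supported_models = {
    "gpt-3.5-turbo": "openai:gpt-3.5-turbo",
    "gpt-4": "openai:gpt-4",
}

def resolve_model(model: str) -> str:
    """Look up a model name, raising the same error wording as the traceback."""
    if model not in supported_models:
        raise ValueError(f"Model are not supported: {model}")
    return supported_models[model]
```

If that's what's happening, the fix would be adding the llama entries back to the provider's model map rather than changing the calling code.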
Any updates or fixes on this?
bruh does nobody care
Bumping this issue because it has been open for 7 days with no activity. Closing automatically in 7 days unless it becomes active again.
There are many other llama providers.
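Until the Vercel provider is fixed, one practical workaround is to try several providers in order and use the first one that accepts the model. A small sketch of that fallback pattern (generic, with hypothetical provider callables — not gpt4free's actual API):

```python
def first_working(providers, prompt):
    """Try each provider callable in order; return the first successful reply.

    If every provider fails (e.g. with the 'Model are not supported'
    ValueError from the issue), re-raise the last error seen.
    """
    last_err = None
    for provider in providers:
        try:
            return provider(prompt)
        except ValueError as err:
            last_err = err  # remember the failure and move on
    raise last_err
```

Swapping in whichever llama-capable providers the library currently lists would make the client code resilient to any one of them breaking.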