How can I switch from an OpenAI endpoint to a local vLLM endpoint?
I deployed a model locally with vLLM and tried to connect to it using the Void extension, but I can't start a conversation. What should I do?
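
For reference, here is a minimal sketch of the kind of setup being asked about: vLLM serving its OpenAI-compatible API locally, and the OpenAI Python client pointed at that server instead of api.openai.com. The model name and port below are placeholders, not necessarily the exact values from my deployment; whatever editor or extension connects to the endpoint would need the same base URL and model name.

```python
# Minimal check that a local vLLM OpenAI-compatible server answers chat requests.
# Assumes the server was started with something like:
#   vllm serve Qwen/Qwen2.5-7B-Instruct --port 8000
# (model name and port are illustrative; match them to your own deployment)
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local vLLM endpoint instead of api.openai.com
    api_key="EMPTY",                      # vLLM accepts any key unless --api-key is set
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-7B-Instruct",     # must match the model the server was launched with
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

If a direct request like this works but the extension still can't start a conversation, the issue is likely in how the extension's provider settings (base URL, model name, API key) are filled in rather than in the vLLM server itself.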