Muhammed Kalkan

Results: 33 comments by Muhammed Kalkan

This was foreseen about a year and a half ago, so no surprises. Keeping this issue open as information for anyone who pays a visit.

VS Code Insiders supports custom LLMs through its OpenAI connection types, but that integration uses the Responses API, so llama.cpp cannot be used with VS Code.
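For context, a hedged sketch of the mismatch: the two APIs use different endpoints and request shapes, so a client hard-wired to one cannot talk to a server that only implements the other. The hosts, model names, and fields below are illustrative assumptions, not taken from either project's documentation.

```typescript
// Illustrative request shapes only -- not a complete spec of either API.
const responsesRequest = {
  // What an OpenAI Responses API client would call:
  url: "https://api.openai.com/v1/responses",
  body: { model: "gpt-4o", input: "Hello" },
};

const chatCompletionsRequest = {
  // What llama.cpp's OpenAI-compatible server exposes:
  url: "http://localhost:8080/v1/chat/completions",
  body: { model: "local", messages: [{ role: "user", content: "Hello" }] },
};

// A Responses-only client never hits the chat-completions route:
console.log(responsesRequest.url.endsWith("/responses"));       // true
console.log(chatCompletionsRequest.url.endsWith("/responses")); // false
```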

As a workaround, try adding this to your outlet class:

```ts
// @ts-ignore
get shadowRoot() { return this; }
```

(Note that the `@ts-ignore` comment must sit on its own line; placed inline, it would comment out the getter itself.)
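A minimal, runnable sketch of the workaround in context, assuming a custom-element outlet class. `Outlet` and `HTMLElementStub` are hypothetical names; the stub stands in for `HTMLElement` so the sketch runs outside a browser, where you would extend `HTMLElement` instead.

```typescript
// Stand-in for HTMLElement so this runs outside a browser (assumption).
class HTMLElementStub {}

class Outlet extends HTMLElementStub {
  // @ts-ignore -- overrides the normally readonly shadowRoot property so
  // queries against shadowRoot resolve against the host element itself
  // (i.e. the component effectively renders into the light DOM).
  get shadowRoot() {
    return this;
  }
}

const outlet = new Outlet();
console.log(outlet.shadowRoot === outlet); // true: the getter returns the host
```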