
Support local LLMs using Ollama, LM Studio, or any other OpenAI-compatible local server

Open · b3nab opened this issue 6 months ago · 1 comment

The hardcoded local server URL is http://localhost:1234 (the default for LM Studio).
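
For context, here is a minimal sketch of what a call against that hardcoded URL looks like. Both LM Studio and Ollama expose the OpenAI-compatible `/v1/chat/completions` route; `LOCAL_SERVER_URL` and `chatWithLocalModel` are illustrative names, not the PR's actual code:

```ts
// Hypothetical constant mirroring the hardcoded default described above.
const LOCAL_SERVER_URL = "http://localhost:1234"

// Minimal chat-completion call against an OpenAI-compatible local server.
async function chatWithLocalModel(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${LOCAL_SERVER_URL}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
    }),
  })
  if (!res.ok) throw new Error(`Local server error: ${res.status}`)
  const data = await res.json()
  return data.choices[0].message.content
}
```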

  • imported the partially working code from Yhozen:feat/ai-sdk
  • added a new model context
  • added an extension-model UI component
  • used the model context inside chat and summary
  • added a new background port to auto-retrieve available models from the local server (see the sketch after this list)
  • improved the select UI by adding tabs for Online and Local models
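
As a rough sketch of that auto-retrieval step: an OpenAI-compatible server lists its loaded models at `GET /v1/models`, so the background port only needs to do something like the following (function and type names here are illustrative, not the PR's actual identifiers):

```ts
// Shape of one entry in the OpenAI-compatible /v1/models response.
interface LocalModel {
  id: string
}

// Fetch the ids of all models the local server currently exposes.
async function fetchLocalModels(baseUrl: string): Promise<string[]> {
  const res = await fetch(`${baseUrl}/v1/models`)
  if (!res.ok) return [] // treat an unreachable server as "no local models"
  const data: { data: LocalModel[] } = await res.json()
  return data.data.map((m) => m.id)
}
```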

Briefly: I took the work you were already doing with @Yhozen on the Yhozen:feat/ai-sdk branch and extended it to support a local server. I didn't go down the path of building a full settings interface to manage things like the server URL (or editing the prompts with proper {variables} for transcript, video title, etc.), so I hardcoded the default server URL from LM Studio.
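
As an aside on the {variables} idea: a tiny interpolation helper would probably be enough. This is just a sketch; `fillPrompt` is a hypothetical name, not something in this codebase:

```ts
// Fill placeholders like {transcript} and {videoTitle} in a prompt template.
function fillPrompt(template: string, vars: Record<string, string>): string {
  return template.replace(/\{(\w+)\}/g, (match: string, key: string) =>
    key in vars ? vars[key] : match // leave unknown placeholders untouched
  )
}

// Usage:
// fillPrompt("Summarize {videoTitle}: {transcript}", {
//   videoTitle: "My Video",
//   transcript: "…",
// })
```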

I wanted to share this, feel free to do whatever you want with this code. 😆

Great work on this extension, @PaoloJN; this local LLM feature is just the cherry on top 🍒.

b3nab · Aug 06 '24 16:08