DanielCastroBosch
Is it possible to use an internal LLM on the same network with a token provided by MS Entra? We have the following steps: 1. Get the token - Authorization:...
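A rough sketch of step 1 (acquiring an Entra ID token via the client-credentials flow) using only the standard library; the scope, tenant, and client values below are placeholders, not verified against any real app registration:

```python
import urllib.parse
import urllib.request

def build_token_request(tenant_id, client_id, client_secret):
    """Build the POST request for a Microsoft Entra ID token
    (client-credentials flow; the scope is a placeholder)."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "api://my-internal-llm/.default",  # placeholder scope
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")
```

Sending this request with `urllib.request.urlopen` should return a JSON body whose `access_token` field is then passed as the `Authorization: Bearer ...` header on calls to the internal LLM.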
@JasonWeill, did you manage to load an offline local model file? I need it too. Thanks in advance....
@JasonWeill Thanks! I have my own LLM server with a simple interface. It works well in Python, but is it possible to add my own model to the jupyter-ai chat...
Hi @krassowski! Thanks for the answer. I did it like the model in the link: ``` from jupyter_ai_magics import BaseProvider from langchain_community.llms import FakeListLLM import requests import json import...
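For a self-contained sketch of the piece such a provider would delegate to, a minimal client for an internal LLM endpoint that attaches the bearer token could look like this; the endpoint URL, payload shape, and response field are assumptions, and in practice this would be wrapped in a `jupyter_ai_magics.BaseProvider` subclass as in the linked example:

```python
import json
import urllib.request

class InternalLLMClient:
    """Minimal client for a hypothetical internal LLM HTTP endpoint.
    The {"prompt": ...} request and {"text": ...} response shapes are
    placeholders; adapt them to your server's actual interface."""

    def __init__(self, endpoint, token):
        self.endpoint = endpoint
        self.token = token

    def build_request(self, prompt):
        # Attach the Entra-issued token as a bearer header on each call.
        payload = json.dumps({"prompt": prompt}).encode()
        return urllib.request.Request(
            self.endpoint,
            data=payload,
            headers={
                "Authorization": f"Bearer {self.token}",
                "Content-Type": "application/json",
            },
            method="POST",
        )

    def __call__(self, prompt):
        with urllib.request.urlopen(self.build_request(prompt)) as resp:
            return json.loads(resp.read())["text"]  # placeholder field name
```

A provider subclass would then call an instance of this client from its LLM's generation method, so jupyter-ai chat messages flow through the internal server.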