
Add an option for a Local LLM?

Open nodered opened this issue 1 year ago • 0 comments

I'm using LM Studio (https://github.com/lmstudio-ai) and have had success with the following:

```python
# Chat with an intelligent assistant in your terminal
from openai import OpenAI

# Point to the local server (LM Studio's default port)
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

history = [
    {
        "role": "system",
        "content": "You are an intelligent assistant. You always provide "
        "well-reasoned answers that are both correct and helpful.",
    },
    {
        "role": "user",
        "content": "Hello, introduce yourself to someone opening this "
        "program for the first time. Be concise.",
    },
]

while True:
    completion = client.chat.completions.create(
        model="local-model",  # this field is currently unused
        messages=history,
        temperature=0.7,
        stream=True,
    )

    # Accumulate the streamed chunks into a single assistant message
    new_message = {"role": "assistant", "content": ""}

    for chunk in completion:
        if chunk.choices[0].delta.content:
            print(chunk.choices[0].delta.content, end="", flush=True)
            new_message["content"] += chunk.choices[0].delta.content

    history.append(new_message)

    print()
    history.append({"role": "user", "content": input("> ")})
```
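One way daneel could expose this as an option is to resolve the endpoint from the environment and fall back to the hosted OpenAI API when no local server is configured. This is only a sketch: the `DANEEL_BASE_URL` variable is hypothetical, not an existing daneel setting.

```python
import os

# Hosted default used when no local server is configured
DEFAULT_OPENAI_URL = "https://api.openai.com/v1"

def resolve_llm_endpoint():
    """Return (base_url, api_key) for either a local server or OpenAI.

    DANEEL_BASE_URL is a hypothetical env var for this sketch; LM Studio
    ignores the API key, so any placeholder works for a local server.
    """
    base_url = os.environ.get("DANEEL_BASE_URL", DEFAULT_OPENAI_URL)
    api_key = os.environ.get("OPENAI_API_KEY", "not-needed")
    return base_url, api_key

# Example: point at a local LM Studio server
os.environ["DANEEL_BASE_URL"] = "http://localhost:1234/v1"
base_url, api_key = resolve_llm_endpoint()
print(base_url)  # → http://localhost:1234/v1
```

The returned pair can be passed straight to `OpenAI(base_url=..., api_key=...)`, so the rest of the chat loop above works unchanged against either backend.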

nodered · Jan 04 '24