Loading Local Pretrained Model Through API
Congratulations on this amazing project! I have a local pretrained model served through https://github.com/oobabooga/text-generation-webui, using their API in this format:
```python
import requests

# Alpaca-style prompt; the "### " markers were eaten by markdown rendering above
prompt = """Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
Write a poem about the transformers Python library. Mention the word "large language models" in that poem.

### Response:
"""

response = requests.post(
    "http://127.0.0.1:7860/run/textgen",
    json={
        "data": [
            prompt,
            200, False, 1.99, 0.18, 1, 1.15, 1, 30, 0, 0, 1, 0, 1, False, -1,
        ]
    },
).json()
data = response["data"]
```
I'm looking forward to being able to use a local API instead of the OpenAI one.
Any suggestions for the required modifications I could work on?
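One possible starting point, sketched below under the assumption that the positional parameter list in the snippet above stays as-is: factor the payload construction and the HTTP call into small helpers, so switching away from OpenAI only means pointing at a different URL. The helper names (`build_textgen_payload`, `local_textgen`) are hypothetical, not part of text-generation-webui:

```python
import json
import urllib.request

def build_textgen_payload(prompt, max_new_tokens=200, do_sample=False,
                          temperature=1.99, top_p=0.18):
    # Positional parameter list expected by the /run/textgen Gradio endpoint;
    # the tail values mirror the defaults used in the snippet above.
    return {
        "data": [
            prompt, max_new_tokens, do_sample, temperature, top_p,
            1, 1.15, 1, 30, 0, 0, 1, 0, 1, False, -1,
        ]
    }

def local_textgen(prompt, url="http://127.0.0.1:7860/run/textgen"):
    # Hypothetical drop-in replacement for an OpenAI completion call,
    # targeting the local text-generation-webui server instead.
    req = urllib.request.Request(
        url,
        data=json.dumps(build_textgen_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["data"]
```

With that in place, call sites that previously hit OpenAI would only need to call `local_textgen(prompt)` instead.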
Regards
Good idea, I was looking for that functionality for a quantized Vicuna-13B model.
related to #2158
This issue was closed automatically because it has been stale for 10 days with no activity.