Pranav
```
from llama_index.llms.openai_like import OpenAILike
from llama_index.core.base.llms.types import ChatMessage

messages = [ChatMessage(role="user", content="Hello World!")]

llm = OpenAILike(
    model="llama-3.1-8b",
    api_base="http://127.0.0.1:8000/v1",
    is_chat_model=True,
    api_key="fake",
)

res = llm.chat(messages)
print(res)
```
This works @Pentar0o. You can...
You can pass context by including all of the previous messages you want the model to see in the list of messages.
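To carry context across turns, keep a running list of messages and append each user and assistant turn before the next `llm.chat` call. A minimal sketch of that pattern (using plain role/content dicts as stand-ins for `ChatMessage` objects, and a placeholder reply instead of a real `llm.chat` call, since only the history-handling structure matters here):

```python
# Conversation history as a running list of role/content messages.
# In llama_index these would be ChatMessage objects; plain dicts are
# used here only to illustrate the structure.
history = [
    {"role": "user", "content": "Hello World!"},
    {"role": "assistant", "content": "Hi! How can I help?"},
]

def ask(history, user_text):
    """Append the new user turn, run the model over the full history,
    and record the reply so the next call sees the whole conversation."""
    history.append({"role": "user", "content": user_text})
    # Real call would be something like:
    #   reply = llm.chat([ChatMessage(**m) for m in history]).message.content
    reply = f"(model saw {len(history)} messages)"  # stand-in reply
    history.append({"role": "assistant", "content": reply})
    return reply

print(ask(history, "What did I say first?"))
```

Because the full history is sent on every call, the model can answer follow-up questions from earlier context; the trade-off is that the prompt grows with each turn.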
Hi @angeloskath, sorry for the late response. You will have to make a couple of changes to [model_io.py](https://github.com/ml-explore/mlx-examples/blob/main/stable_diffusion/stable_diffusion/model_io.py) of stable diffusion:

1. [Update this line (L175) to](https://github.com/ml-explore/mlx-examples/blob/main/stable_diffusion/stable_diffusion/model_io.py#L175)
   ```py
   model.load_weights(list(weights), strict=True)
   ```
2. ...
Hi @Cyrilvallez, thanks for the feedback; I made the requested refactoring changes. Also, after removing the init from the modular implementation as suggested, the generated modeling code does not have...
@Cyrilvallez Thanks for the feedback. I removed the pretraining TP from the configurations and added scaffolding for generation integration testing. We will add more robust integration tests and update the checkpoints...
Hi @AlexCheema, Can I work on session isolation?
Hi @AlexCheema, I DM'ed you on Discord; I will also take a look at the stable diffusion bounty.
Hi, I noticed this bounty issue has been open for about a month without recent activity. I'm interested in working on this task. Before I start, I wanted to check.
Hi @dhruvmalik007, I was just checking whether someone was still working on it. I will look at other issues. Thank you.
Hi @AlexCheema , Can I take this up?