Rakan
@thinkverse - Thanks for your comments. Perhaps they scrapped the system prompt to keep the model's training simpler? In any case, feeding in a system prompt as the user...
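For anyone trying that "system prompt as user" workaround, here's a minimal sketch against Ollama's REST API that just folds the system text into the prompt; the model name and prompt strings are placeholders, not from this thread:

```python
import requests

# Workaround sketch: prepend the system prompt to the user prompt
# instead of sending a separate system turn. Assumes a local Ollama
# server on the default port.
system_prompt = "You are a helpful assistant."
user_prompt = "Summarize this issue in one sentence."

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3",
        "prompt": f"{system_prompt}\n\n{user_prompt}",
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```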
> Phi 3 has been updated with a system prompt as per their published tokenizer configuration.
>
> ```
> % ollama run phi3
> >>> /set system You are...
> ```
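For reference, the same thing can be done programmatically through Ollama's chat endpoint, which is the API equivalent of `/set system` in the CLI. A minimal sketch, assuming a local server on the default port (the system text here is a placeholder):

```python
import requests

# Send a proper system turn via Ollama's /api/chat endpoint.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "phi3",
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Hello!"},
        ],
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```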
Maybe related to: https://github.com/ollama/ollama/issues/4076
@wuriyanto48 @CaptXiong - I'm assuming what you guys figured out requires modifying the source code and recompiling, correct? If so, can we get the project team to look at this?
> @rb81 You don't need to recompile the source code, just like this
>
> ```python
> model_id = "microsoft/Phi-3-mini-4k-instruct"
> model = AutoModelForCausalLM.from_pretrained(
>     model_id,
>     torch_dtype="auto",
>     trust_remote_code=True,...
> ```
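For anyone who wants to try that end to end, here's a self-contained sketch. Everything beyond the quoted lines (the tokenizer, the chat-template call, and the generation settings) is my assumption, not from the comment above:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    trust_remote_code=True,
)

# Whether the template emits a real system turn depends on the chat
# template shipped in the model's tokenizer configuration.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=64)
# Decode only the newly generated tokens.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```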
I agree with @5183nischal. It would be good to be able to select a folder, for transparency if nothing else, but it also serves a practical use case for many. I would also suggest easy...
**+1** This is very important.
**+1** Ollama and more external endpoints.