Alex Wang

45 comments by Alex Wang

@matthieusieben Oh, I understand. There is a config file storing the Ollama endpoint at `%userprofile%\.aitk\models\my-models.yml`; you can search for Ollama in it and update the endpoint....
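To locate that config file and find the Ollama entries, a minimal sketch (the file path comes from the comment above; the YAML schema itself is not shown here, so this only surfaces the lines to edit):

```python
from pathlib import Path

# %userprofile% is the Windows home directory; Path.home() gives the
# equivalent location cross-platform.
cfg = Path.home() / ".aitk" / "models" / "my-models.yml"

def ollama_lines(path: Path) -> list[str]:
    """Return config lines mentioning Ollama, or [] if the file is absent."""
    if not path.exists():
        return []
    return [ln for ln in path.read_text(encoding="utf-8").splitlines()
            if "ollama" in ln.lower()]

print(ollama_lines(cfg))
```

Edit the endpoint value on the matching line in your editor, then restart the extension so it picks up the change.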

Thanks for the clarification. Now it is clear. 1. The "Ollama Models" section on the Catalog page lists predefined, well-tested Ollama models, so it cannot be adjusted by users. 2....

Hi @shaneholloman, thanks for using AI Toolkit. As the error message suggests, your image file size exceeds the limit (10 MB) for that model. Could you try with smaller image files? ![Image](https://github.com/user-attachments/assets/448fb504-47d6-4488-944f-7872ea5ea690)
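A quick way to check files before attaching them, assuming the 10 MB figure from the error message (the service's exact byte accounting is an assumption here):

```python
import os

# Assumed limit from the error message: 10 MB.
MAX_IMAGE_BYTES = 10 * 1024 * 1024

def within_image_limit(path: str) -> bool:
    """True if the file on disk is at or under the assumed 10 MB limit."""
    return os.path.getsize(path) <= MAX_IMAGE_BYTES
```

Running this over your images first avoids a round-trip to the model just to hit the size error.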

I tested on Ollama 0.5.7 and it is working, so it could be some other issue here. Could you try opening `http://<host>:<port>/api/tags` (it is usually `http://127.0.0.1:11434/api/tags`)? You can see the host...

Thanks for the feedback. We can confirm this bug and have added it to our backlog.

> Added:
>
> ollama list
> NAME ID SIZE MODIFIED
> gpt-oss:20b aa4295ac10c3 13 GB 28 hours ago

Hi @warm3snow, this is a different issue. I tried ollama's...

Which model provider are you using? Currently, we have DeepSeek R1 models in ONNX, GitHub, Foundry and Ollama.

We have added it to our backlog. Could you share which model provider you want to use for GPT-5? Is it OpenAI's GPT-5 or GitHub's GPT-5?

> > We have added it to our backlog. Could you share which model provider you want to use for GPT-5? Is it OpenAI's GPT-5 or GitHub's...

@dylbarne Currently the local inference endpoint only supports local ONNX models, so passing in a remote model name will cause a 400 error, which is by design. This is a feature...