Mohamed Moneer
> @MMoneer I just [pushed a fix](https://github.com/wandb/openui/commit/9aeff3503fe46c907ca8a6db6dfa0e4819cfe903) that should resolve this, not sure what's going on exactly but can you try pulling and seeing if you still get the...
> To make this work generically we would need a uniform way to get a list of what models are available. The LM Studio server works when there is a model already...
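One generic way to list available models, since LM Studio exposes an OpenAI-compatible API, is to query its `GET /v1/models` endpoint. A minimal sketch, assuming the server is running on LM Studio's default base URL (`http://localhost:1234/v1`); the function name is my own:

```python
import json
import urllib.request


def list_models(base_url="http://localhost:1234/v1"):
    """Return the model IDs reported by an OpenAI-compatible server.

    Works against LM Studio's local server (and similar backends) via
    the standard /v1/models endpoint.
    """
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        payload = json.load(resp)
    return [model["id"] for model in payload.get("data", [])]
```

This only helps for backends that implement the OpenAI-style endpoint; other local runtimes would need their own discovery mechanism, which is exactly the non-uniformity the comment is pointing at.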
> Which is the best local LLM to run OpenUI? @vanpelt mentioned LLaVA, so try one of the v1.6 7B, 13B, or 34B models.
Try:
- Uninstall the existing packages: `python_embeded\python.exe -m pip uninstall onnx onnxruntime onnxruntime-gpu`
- Install these versions: `python_embeded\python.exe -m pip install onnxruntime-gpu==1.14.0 onnxruntime==1.14.0 onnx==1.14.0`
The Searge update did not work; I just reinstalled it today and it is not fixed yet. ComfyUI moved the `prepare_mask` function from `sample.py` to `sampler_helpers.py` in commit 0542088, so the right...
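When a dependency relocates a function like this, a common compatibility fix is a try/except import fallback that works on both sides of the move. A sketch of that pattern, assuming the module paths `comfy.sampler_helpers` and `comfy.sample` from the comment (the helper function wrapping it is my own):

```python
def resolve_prepare_mask():
    """Locate ComfyUI's prepare_mask across the relocation in commit 0542088.

    Tries the new module first, falls back to the old one, and returns
    None when ComfyUI is not installed at all.
    """
    try:
        # New location (after commit 0542088)
        from comfy.sampler_helpers import prepare_mask
    except ImportError:
        try:
            # Old location (before the move)
            from comfy.sample import prepare_mask
        except ImportError:
            prepare_mask = None  # ComfyUI not importable in this environment
    return prepare_mask
```

This keeps an extension like SeargeSDXL working against both older and newer ComfyUI checkouts instead of pinning to one side of the move.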
@maxarch3 They merged the PR that fixes the issue; just update or reinstall SeargeSDXL.
> Seeing this bug in LM Studio 0.2.18 and AutoGen Studio v0.0.56. Same issue: the output gets separated words.
> @avonx I tried the proposed solution of editing the dbdefaults.json file to add the line about max_tokens, and also tried using the file you provided (thank you for that), but...