Cristian Pietrobon
@marcj I see, but you have to try it with authentication:

```yml
version: '3.1'

services:
  mongo:
    image: mongo:5.0.10
    restart: always
    ports:
      - 27017:27017
    environment:
      MONGO_INITDB_ROOT_USERNAME: root
      MONGO_INITDB_ROOT_PASSWORD: root
      MONGO_INITDB_DATABASE: root-db
    volumes:
      ...
```
Not an expert on Cloudflare, but as a workaround it should be easy to write a new file (containing the validation function) and import it dynamically. It's basically the same...
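A minimal sketch of that workaround, assuming a Node-style runtime where dynamic `import()` is available (the file name, `loadValidator` helper, and `validate` export are all assumptions for illustration, not names from the project):

```typescript
// Write the generated validation function to its own module file,
// then load it with dynamic import() instead of eval()/new Function().
import { writeFile } from "node:fs/promises";
import { pathToFileURL } from "node:url";

export async function loadValidator(
  source: string,
): Promise<(value: unknown) => boolean> {
  // Assumed output path; the generated module must export `validate`.
  const file = "validator.generated.mjs";
  await writeFile(file, source);
  // import() resolves the module at runtime, so the code never goes
  // through eval(); note that import() caches per URL, so reuse a new
  // file name if the source changes.
  const mod = await import(pathToFileURL(file).href);
  return mod.validate;
}
```

Note this is only a sketch of the "write file + dynamic import" idea; whether the deployment target allows importing files created at runtime is a separate question.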
I had a similar problem; make sure you are running the **correct** model variant, e.g. CUDA vs. Triton. Use the CUDA one and delete the other.
I also had this problem after installing the ROCm version. Solved: remove the installed bitsandbytes (if any), then build/install the ROCm fork.
same

```bash
Traceback (most recent call last):
  File "/home/cristian/text-generation-webui/server.py", line 869, in <module>
    shared.model, shared.tokenizer = load_model(shared.model_name)
  File "/home/cristian/text-generation-webui/modules/models.py", line 53, in load_model
    model = AutoModelForCausalLM.from_pretrained(Path(f"{shared.args.model_dir}/{shared.model_name}"), low_cpu_mem_usage=True, torch_dtype=torch.bfloat16 if shared.args.bf16 else...
```
Found [this guide](https://www.reddit.com/r/KoboldAI/comments/12iim4k/complete_guide_for_koboldai_and_oobabooga_4_bit/) for AMD GPUs and it works!
Same here, still broken on 1.1.4
Can someone explain to me what to do so that I can open a PR? I need this feature ASAP, since using Linux with Wayland and GNOME gives me no...
Any news? This is blocking.
@Geequlim I get errors in the build, probably because of the refactor on the GDScript side (fresh from the Godot GitHub).