KoboldAI-Client
libmamba failed to execute pre/post link script for cudatoolkit
When I try to build the docker-cuda image, I get the following error:
```
#0 122.2
#0 122.2 error libmamba response code: -1 error message: Invalid argument
#0 122.2 critical libmamba failed to execute pre/post link script for cudatoolkit
------
failed to solve: process "/usr/local/bin/_dockerfile_shell.sh micromamba install -y -n base -f /home/micromamba/env.yml" did not complete successfully: exit code: 1
```
OS: Arch Linux, NVIDIA driver version: 530.41.03
P.S. oobabooga/text-generation-webui and AUTOMATIC1111/stable-diffusion-webui both work fine in Docker on this system.
I also get the same error on:
OS: NixOS, NVIDIA driver version: 545.29.06
I found a related issue:
https://github.com/mamba-org/micromamba-docker/issues/368
But it was closed in favor of:
https://github.com/mamba-org/mamba/issues/2501
It recommends this workaround:

```
docker build --ulimit nofile=262144:262144 ...
```
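To check whether a raised limit is actually in effect (e.g. inside the build container via a temporary `RUN ulimit -n` step), a small sketch like this works; the `262144` target is the value from the workaround above, everything else is my own scaffolding:

```shell
# Compare the soft open-file limit against the workaround's target value.
# Run inside the container or build step; on the host it just reports
# your shell's own limit.
target=262144
soft=$(ulimit -Sn)
echo "soft nofile limit: $soft"
if [ "$soft" != "unlimited" ] && [ "$soft" -lt "$target" ]; then
  echo "limit below $target; the micromamba link script may fail"
fi
```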
So I'm trying the equivalent in `docker-cuda/docker-compose.yml`:

```diff
@@ -2,6 +2,10 @@ version: "3.2"
 services:
   koboldai:
     build: .
+    ulimits:
+      nofile:
+        soft: 262144
+        hard: 262144
     environment:
       - DISPLAY=${DISPLAY}
     network_mode: "host"
```
Re-running the build now.
Update: that didn't fix it; I get the same error.
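One possible explanation, as far as I can tell: a service-level `ulimits` key applies when the container is *run*, not when the image is *built*, so the build step may still see the old limit. The Compose specification also allows `ulimits` under the `build:` section; a sketch, assuming a `docker compose` version recent enough to support it:

```yaml
# Hedged sketch: raise the open-file limit for the build container itself,
# not just the runtime container (requires build.ulimits support).
services:
  koboldai:
    build:
      context: .
      ulimits:
        nofile:
          soft: 262144
          hard: 262144
```

If that field isn't supported, building the image directly with `docker build --ulimit nofile=262144:262144 ...` and pointing compose at the resulting image should sidestep the problem.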