loeken
@deece it now uses HEAD; updated it to work with the new changes ( https://github.com/oobabooga/text-generation-webui/wiki/LLaMA-model#4-bit-mode )
@MarlinMr mapped the extensions folder (and a few others) in the docker-compose
@MarlinMr running pip3 installs for the extensions too now, using the same caching as the others; also exposed port 5000 for the API via docker-compose
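A rough sketch of the kind of docker-compose changes described above (service name, image paths, and mount points are assumptions, not the actual PR contents):

```yaml
# sketch only: names and paths are hypothetical
services:
  text-generation-webui:
    build: .
    ports:
      - "7860:7860"   # web UI
      - "5000:5000"   # API, as mentioned above
    volumes:
      # map the extensions folder (plus other host folders) into the container
      - ./extensions:/app/extensions
      - ./models:/app/models
```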
@oobabooga mind merging this? it would make it easier to hop between branches and test in Docker
yeah, this PR has turned into a bit of a mess. I'll close this one and create a new, clean one:
https://github.com/oobabooga/text-generation-webui/pull/633
it seems that the docs use the tags 1.8.0/1.9.0, but the actual images are tagged v1.8.0 and v1.9.0, which is kinda confusing ;)
it's been half a year, any progress on this?