Models in Zoo show up as undefined on model card mouse over and 404 on HF
Expected Behavior
See the model's description and details, and open the model card when clicking the link.
Current Behavior
All model cards are showing as undefined.
Steps to Reproduce
- Go to Settings
- Go to Model Zoo
- Mouse over the "View full model card" or click the link
Possible Solution
Am I missing some environment variable, perhaps? I've been using and following the project since the early stages and this hasn't happened before.
Side note: Thank you for all your hard work, it's immensely appreciated!
Context
This happened on an upgrade/git pull and also on a fresh install today.
Oh, clearly you need to upgrade many things. You need to update the tool as well as the zoos. Maybe activate autoupdate in the main settings, then reboot the app to force it to update the zoos, or do it manually using git pull on the corresponding folders.
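For the manual route, here is a minimal sketch of what that git pull could look like, assuming a Docker install where the personal space is /app/documents/lollms and assuming these zoo folder names; both are guesses, so adjust them to whatever your startup log reports:

```bash
# Manual zoo update sketch (path and folder names are assumptions, not confirmed):
cd /app/documents/lollms
for zoo in bindings_zoo personalities_zoo extensions_zoo; do
  git -C "$zoo" pull   # only works if the folder is an actual git clone
done
```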
I actually have autoupdate on, and when I deactivated it, restarted, reactivated it, and restarted again, it did pull some files. However, I'm still having the same issue. On console startup, I get:
----------------------Paths information-----------------------
personal_path:/app/documents/lollms
personal_configuration_path:/app/documents/lollms/configs
personal_databases_path:/app/documents/lollms/databases
personal_models_path:/app/documents/lollms/models
personal_user_infos_path:/app/documents/lollms/user_infos
personal_trainers_path:/app/documents/lollms/trainers
personal_trainers_path:/app/documents/lollms/trainers/gptqlora
personal_data_path:/app/documents/lollms/data
Bindings zoo found in your personal space. Pulling last personalities zoo
Already up to date.
Personalities zoo found in your personal space. Pulling last personalities zoo
Already up to date.
Extensions zoo found in your personal space. Pulling last Extensions zoo
Already up to date.
Qlora found in your personal space. Pulling last qlora code
fatal: not a git repository (or any parent up to mount point /) <================= This is the only fail I get
Stopping at filesystem boundary (GIT_DISCOVERY_ACROSS_FILESYSTEM not set).
Loading binding gptq. Please wait ...
Binding gptq loaded successfully.
No model selected
Personality lollms mounted successfully but no model is selected
Checking discussions database... ok
Your personal data is stored here :/app/documents/lollms
Checking for updates from /app/lollms-webui
update availability: False
debug mode:true
Please open your browser and go to http://localhost:9600 to view the ui
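That qlora pull is the only failure: git reports the folder is not a repository, which usually means the directory exists but was never actually cloned. A quick way to check, as a sketch that assumes the gptqlora folder sits under the trainers path printed above (the exact path is a guess from the log, not confirmed):

```bash
# Check whether the gptqlora folder is a real git clone; if it is not, moving it
# aside should (assumption) let the app re-clone it on the next start.
if ! git -C /app/documents/lollms/trainers/gptqlora rev-parse --is-inside-work-tree; then
  mv /app/documents/lollms/trainers/gptqlora /app/documents/lollms/trainers/gptqlora.bak
fi
```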
Probably better if I rebuild the app from the Dockerfile. What are your thoughts on this?
Thanks for the support, much appreciated!
I'm having the same issue, but only when I select exllama as the binding. I've only tested ctransformers and exllama, but it's still an issue. Hopefully it can get fixed soon.
So I deleted the container I was using and rebuilt everything from scratch using linux_install.sh. Install and setup ran fine, but there is still no model description at all: I still get the undefined link and, of course, the resulting 404 error, since the model link is wrong. Then I ran linux_update_models.sh (I had to correct the path to the bindings_zoo), and although it did git clone and copy files, the models still show the same symptoms, with not one model available.
I'm going a bit in circles here. Any suggestions?
Thanks.
Ok, so I broke through a bit. I managed to roll back from 6.5RC1 to 6.3 using the exact same method of install, and with 6.3 I can see all the models.
6.5RC1 screenshot
6.3 screenshot
So there's definitely something breaking the model links in 6.5. Unfortunately I can't really see what it is; nothing jumps out at first sight in the console, even with debug turned on.
Any suggestion on where to look? Maybe more people are having the same issue as me and KoolenDasheppi.
I ran another test: installed Windows, set the whole thing up, then installed RC1 from the executable, opened the app, closed it, updated the models through the .bat file, reopened the app, and installed the gptq binding. When trying to get any model, hovering over the link immediately shows undefined, and clicking to install the model file anyway gives a similar undefined error.
Windows screenshot
Quick update: with v6.5RC2 I can see the models again, but only 6 binding types. I presume some changes are ongoing, as in RC1 I could see gptq, exllama, etc.
But the good news is that at least with c_transformers it works now.
Thanks!
Hi there, and sorry for this. I suggest you follow me on Twitter or Discord and activate notifications for announcements so that you keep up with the news.
A few weeks ago I decided to reduce the number of bindings to the minimum, because I can't maintain them all alone and they keep changing all the time. Now Hugging Face replaces both gptq and exllama, as it has incorporated all of those functionalities, so it doesn't make sense to keep them all. CTransformers is for GGML and GGUF files; it is compatible with all models, which makes it the ideal candidate. I kept GPT4All because Andriy asked me to, and they do have something the others don't, which is Vulkan support. So if you have an AMD GPU, gpt4all is the way to go. They are working on GGUF support and they promised to release it by the end of next week. I'll announce it on the socials when it is ready.
Here is my Twitter: https://twitter.com/SpaceNerduino
Here is the Discord: https://discord.gg/vHRwSxb5