Page not receiving data
### Describe the bug

Opening http://127.0.0.1:7860 after running `server.py` results in:

> This page isn’t working. 127.0.0.1 didn’t send any data. ERR_EMPTY_RESPONSE

Sometimes also:

> This site can’t be reached. The web page at http://127.0.0.1:7860/ might be temporarily down or it may have moved permanently to a new web address. ERR_ADDRESS_IN_USE
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Reproduction

- Install textgen for WSL
- Apply the "dirty fix"
- Set up GPTQ and download the models
- Run the server with:

      python server.py --model llama-13b-4bit-128g --wbits 4 --groupsize 128
### Screenshot

_No response_
### Logs
```
(textgen) angdev@7DEV-DESKTOP:/mnt/c/users/vi7or/documents/text-generation-webui$ python server.py --model llama-13b-4bit-128g --wbits 4 --groupsize 128
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
================================================================================
CUDA SETUP: CUDA runtime path found: /home/angdev/miniconda3/envs/textgen/lib/libcudart.so
CUDA SETUP: Highest compute capability among GPUs detected: 8.6
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary /home/angdev/miniconda3/envs/textgen/lib/python3.10/site-packages/bitsandbytes/libbitsandbytes_cuda117.so...
Loading llama-13b-4bit-128g...
Loading model ...
/home/angdev/miniconda3/envs/textgen/lib/python3.10/site-packages/safetensors/torch.py:99: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  with safe_open(filename, framework="pt", device=device) as f:
/home/angdev/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/_utils.py:776: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  return self.fget.__get__(instance, owner)()
/home/angdev/miniconda3/envs/textgen/lib/python3.10/site-packages/torch/storage.py:899: UserWarning: TypedStorage is deprecated. It will be removed in the future and UntypedStorage will be the only storage class. This should only matter to you if you are using storages directly. To access UntypedStorage directly, use tensor.untyped_storage() instead of tensor.storage()
  storage = cls(wrap_storage=untyped_storage)
Done.
Loaded the model in 65.96 seconds.
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
```
### System Info

Windows 10 Pro 64-bit (10.0, Build 19045)
Intel Core i9-10900K CPU
64 GB memory
NVIDIA GeForce RTX 3090
Using WSL to run textgen
Follow https://github.com/oobabooga/text-generation-webui/wiki/WSL-installation-guide#bonus-port-forwarding
I followed that section of the installation guide covering port forwarding. Despite adhering to the instructions, the problem persisted. I restarted my PC to see if that would help, but the issue remained. I ran the port-forwarding command again and restarted my PC once more, but the problem still wasn't fixed.
Same thing is happening to me; it's also not working after rerunning the command and restarting. Weirdly, it did allow me to connect briefly, for about 30 seconds, when I first ran it (I had not run the PowerShell command initially), then it kicked me out and I haven't been able to connect since.
Using --listen-port with 7861 (or any other port other than 7860) fixed this for me. Still not sure what's causing the default port to not work though.
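The ERR_ADDRESS_IN_USE message suggests something is already holding port 7860, which would explain why moving to 7861 helps. As a quick diagnostic, a small Python sketch can check whether a port is already bound before you fall back to `--listen-port` (the `port_is_free` helper here is hypothetical, not part of textgen):

```python
import socket

def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is currently bound to host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        try:
            # bind() raises OSError (EADDRINUSE) if another process --
            # or a stale forwarding rule -- already holds the port
            s.bind((host, port))
            return True
        except OSError:
            return False

if __name__ == "__main__":
    for port in (7860, 7861):
        status = "free" if port_is_free(port) else "in use"
        print(f"port {port}: {status}")
```

If 7860 reports "in use" while the server is *not* running, some other process (or a leftover listener) has claimed it, and switching ports is a reasonable workaround.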
Worked for me too. Thank you 😁
Has anyone been able to determine why this fix fixes it? Does it work only if you followed the port forward instructions or even without them?
Same for me! This is a very weird bug, but the workaround seems to work!
I finally got it to run.

Get the WSL2 IP:

    ip a

In my case the WSL2 IP is 172.24.247.21.

Set up port forwarding in Windows. For the web UI:

    netsh interface portproxy add v4tov4 listenport=7860 listenaddress=0.0.0.0 connectport=7860 connectaddress=172.24.247.21

For the API:

    netsh interface portproxy add v4tov4 listenport=5000 listenaddress=0.0.0.0 connectport=5000 connectaddress=172.24.247.21
    netsh interface portproxy add v4tov4 listenport=5005 listenaddress=0.0.0.0 connectport=5005 connectaddress=172.24.247.21

Then start the textgen server with:

    python server.py --listen --listen-host 0.0.0.0 --listen-port 7860
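To verify that a forwarding setup like the one above actually works, you can probe the port with a plain TCP connect. This `port_reachable` helper is an illustrative sketch, not a textgen utility; substitute your own WSL2 address for the example IP:

```python
import socket

def port_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Attempt a TCP connect; True means something is listening there."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Check the loopback address; from Windows you would also check the
    # WSL2 IP reported by `ip a` (e.g. 172.24.247.21 in the steps above).
    host, port = "127.0.0.1", 7860
    state = "reachable" if port_reachable(host, port) else "unreachable"
    print(f"{host}:{port} is {state}")
```

If the WSL2 IP is reachable but 127.0.0.1 on the Windows side is not, the server is fine and the portproxy rule is the part to re-check.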
This command worked for me. Thank you!
The key is to set `--listen-host` to 0.0.0.0.
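A minimal sketch of why the bind address matters, using only Python's `socket` module (independent of textgen): a server bound to 127.0.0.1 inside WSL2 only accepts connections arriving via its own loopback interface, so the Windows-side portproxy, which connects through the WSL virtual network interface, can never reach it. Binding to 0.0.0.0 listens on every interface:

```python
import socket

# Bound to 127.0.0.1: only connections arriving over the loopback
# interface of the WSL VM itself are accepted.
loopback_only = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
loopback_only.bind(("127.0.0.1", 0))

# Bound to 0.0.0.0 (the wildcard address): connections on every
# interface are accepted, including the virtual NIC that the
# Windows-side portproxy connects to.
all_interfaces = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
all_interfaces.bind(("0.0.0.0", 0))

print(loopback_only.getsockname()[0])   # 127.0.0.1
print(all_interfaces.getsockname()[0])  # 0.0.0.0

loopback_only.close()
all_interfaces.close()
```

The same distinction is what `--listen --listen-host 0.0.0.0` toggles in `server.py`.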
This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.