MiniGPT-4

Error: Cannot load 127.0.0.1:7860

[Open] zxcvbn114514 opened this issue 1 year ago · 7 comments

[screenshot] Everything else seems to have succeeded. VRAM usage was 17 GB / 24 GB.

zxcvbn114514 · Apr 21 '23 14:04

How did you solve your CUDA problem?

oldtreee · Apr 21 '23 15:04

> How did you solve your CUDA problem?

I replied to you. Also, once the local link is printed, you need a VPN to load it.

zxcvbn114514 · Apr 22 '23 02:04

[screenshot] @zxcvbn114514 Hi, I'm running it on an A100 server, but it won't load at the server address plus :7860, and 127.0.0.1:7860 doesn't load either. Could you give me some advice? I have a VPN on my local machine, but the server doesn't.

HonestyBrave · Apr 24 '23 13:04

A port on your local machine probably can't reach a port on the server; I'm not too familiar with this part either.
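(A quick sanity check on that: 127.0.0.1 always refers to the machine the browser or script is running on, so it can never reach a demo that is only listening on a remote A100 server. Below is a small, purely illustrative Python snippet for testing reachability from either side; port_open is a hypothetical helper, not part of MiniGPT-4, and xx.xxx.xxx.xxx is the placeholder server IP used later in this thread.)

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Run this on your laptop: the first check looks at the laptop itself,
# the second at the remote server where the Gradio demo is listening.
print(port_open("127.0.0.1", 7860))       # the local machine, not the server
print(port_open("xx.xxx.xxx.xxx", 7860))  # placeholder server IP from the thread
```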

zxcvbn114514 · Apr 24 '23 14:04

@zxcvbn114514 I solved it: just add the server IP, like this: demo.launch(server_name="xx.xxx.xxx.xxx", share=True, enable_queue=True)

What I don't get is that it still needs a VPN once it's up, even though it's running locally.
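(For reference, a minimal hedged sketch of the launch call, not the exact MiniGPT-4 demo code. Binding with server_name makes the app reachable at http://<server-ip>:7860 over the network, while share=True additionally opens a public gradio.live tunnel, and it is that tunnel that needs outbound internet access / a VPN. Gradio 3.x is assumed, and server_name="0.0.0.0" is a substitution meaning "listen on all interfaces".)

```python
import gradio as gr

# Minimal placeholder UI; the real interface is built in MiniGPT-4's demo.py.
with gr.Blocks() as demo:
    gr.Markdown("MiniGPT-4 demo placeholder")

# server_name="0.0.0.0": listen on all interfaces, so http://<server-ip>:7860 works on the LAN.
# share=True: also create a public gradio.live tunnel (requires outbound internet access).
# enable_queue=True: same flag as in the thread; Gradio 3.x accepted it in launch().
demo.launch(server_name="0.0.0.0", server_port=7860, share=True, enable_queue=True)
```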

Later I uploaded an image and then entered a message, and got the error below. Have you run into this?

```
Traceback (most recent call last):
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/gradio/routes.py", line 394, in run_predict
    output = await app.get_blocks().process_api(
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/gradio/blocks.py", line 1075, in process_api
    result = await self.call_function(
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/gradio/blocks.py", line 884, in call_function
    prediction = await anyio.to_thread.run_sync(
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "demo.py", line 93, in gradio_answer
    llm_message = chat.answer(conv=chat_state, img_list=img_list, max_new_tokens=1000, num_beams=num_beams, temperature=temperature)[0]
  File "/home/shenjh/github/MiniGPT-4/minigpt4/conversation/conversation.py", line 141, in answer
    outputs = self.model.llama_model.generate(
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/transformers/generation/utils.py", line 1558, in generate
    return self.sample(
  File "/home/shenjh/anaconda3/lib/python3.8/site-packages/transformers/generation/utils.py", line 2641, in sample
    next_tokens = torch.multinomial(probs, num_samples=1).squeeze(1)
RuntimeError: probability tensor contains either inf, nan or element < 0
```
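(A note on the traceback: torch.multinomial raises this RuntimeError whenever the probability tensor passed to it contains nan, inf, or negative entries, which usually means the model produced nan logits, commonly a low-precision (fp16/8-bit) issue. The snippet below is a standalone illustration of the failure plus one possible guard; it is not MiniGPT-4 code, and falling back to greedy decoding is only one of several workarounds.)

```python
import torch

# Reproduce the failure mode: a single nan logit corrupts the whole distribution.
logits = torch.tensor([1.0, float("nan"), 2.0])
probs = torch.softmax(logits, dim=-1)   # nan propagates into every probability

if torch.isfinite(probs).all():
    next_token = torch.multinomial(probs, num_samples=1)   # normal sampling path
else:
    # One possible guard: replace nan/-inf logits and fall back to greedy decoding
    # instead of sampling from the corrupted distribution.
    next_token = torch.nan_to_num(logits, nan=0.0, neginf=-1e9).argmax(dim=-1, keepdim=True)

print(next_token)
```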

HonestyBrave · Apr 25 '23 01:04

I'd like to ask: has anyone been able to run demo.py directly locally, without Gradio?

ychgoaround · Apr 25 '23 13:04

> [...] RuntimeError: probability tensor contains either inf, nan or element < 0

Has this problem been solved? I'm running into it too.

birchmi · May 04 '23 08:05

> Has anyone been able to run demo.py directly locally, without Gradio?

I have implemented a simple local version for your reference; try this: https://github.com/JunnanDong/Localized_MiniGPT4.
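(For anyone who wants to drive the pipeline from a plain script instead of the Gradio UI, here is a rough sketch pieced together from the calls demo.py makes. Only chat.answer(...) is confirmed by the traceback earlier in this thread; the config path, registry lookups, and the other method names are assumptions based on the April 2023 version of the repo and may differ in other versions.)

```python
# Rough, version-dependent sketch of a Gradio-free MiniGPT-4 driver.
import argparse
from minigpt4.common.config import Config
from minigpt4.common.registry import registry
from minigpt4.conversation.conversation import Chat, CONV_VISION

# Mirror the arguments demo.py parses; the config path is the repo's eval config (assumed).
args = argparse.Namespace(cfg_path="eval_configs/minigpt4_eval.yaml", gpu_id=0, options=None)
cfg = Config(args)

# Build the model and visual processor the same way demo.py does.
model_cls = registry.get_model_class(cfg.model_cfg.arch)
model = model_cls.from_config(cfg.model_cfg).to("cuda:0")
vis_cfg = cfg.datasets_cfg.cc_sbu_align.vis_processor.train
vis_processor = registry.get_processor_class(vis_cfg.name).from_config(vis_cfg)

# Chat holds the conversation state; upload an image, ask a question, get an answer.
chat = Chat(model, vis_processor, device="cuda:0")
chat_state, img_list = CONV_VISION.copy(), []
chat.upload_img("example.jpg", chat_state, img_list)   # path or PIL image (assumed)
chat.ask("Describe this image.", chat_state)
answer = chat.answer(conv=chat_state, img_list=img_list,
                     max_new_tokens=300, num_beams=1, temperature=1.0)[0]
print(answer)
```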

JunnanDong · Jul 03 '23 07:07