
Opening http://0.0.0.0:7860 shows nothing — the page is broken

VXeffect opened this issue 1 year ago · 1 comment

Console output after starting the app:

```
Beautiful is better than ugly.
Explicit is better than implicit.
Simple is better than complex.
Complex is better than complicated.
Flat is better than nested.
Sparse is better than dense.
Readability counts.
Special cases aren't special enough to break the rules.
Although practicality beats purity.
Errors should never pass silently.
Unless explicitly silenced.
In the face of ambiguity, refuse the temptation to guess.
There should be one-- and preferably only one --obvious way to do it.
Although that way may not be obvious at first unless you're Dutch.
Now is better than never.
Although never is often better than right now.
If the implementation is hard to explain, it's a bad idea.
If the implementation is easy to explain, it may be a good idea.
Namespaces are one honking great idea -- let's do more of those!
C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\transformers\utils\generic.py:441: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  _torch_pytree._register_pytree_node(
C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\transformers\utils\hub.py:123: FutureWarning: Using TRANSFORMERS_CACHE is deprecated and will be removed in v5 of Transformers. Use HF_HOME instead.
  warnings.warn(
C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\transformers\utils\generic.py:309: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  _torch_pytree._register_pytree_node(
C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\transformers\utils\generic.py:309: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  _torch_pytree._register_pytree_node(
A matching Triton is not available, some optimizations will not be enabled
Traceback (most recent call last):
  File "C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\xformers\__init__.py", line 55, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\xformers\triton\softmax.py", line 11, in <module>
    import triton
ModuleNotFoundError: No module named 'triton'
C:\Tools\MiniConda\envs\storydiff\Lib\site-packages\diffusers\utils\outputs.py:63: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
Loading pipeline components...: 100%|████████████████████████████████████████████████████| 7/7 [00:03<00:00, 2.21it/s]
successsfully load paired self-attention
number of the processor : 36
Running on local URL: http://0.0.0.0:7860

To create a public link, set share=True in launch().
```

The startup looks successful, but opening http://0.0.0.0:7860 gives a broken page.

VXeffect · May 08 '24 13:05

You should open http://127.0.0.1:7860 instead. `0.0.0.0` is not a destination you browse to — it means the server is listening on every IP address of the machine.
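To illustrate the point (this is a standalone sketch, not StoryDiffusion's code): a socket bound to `0.0.0.0` is using a wildcard "listen on all interfaces" address, and it happily accepts a connection addressed to the loopback address `127.0.0.1`:

```python
# Demonstration: binding to 0.0.0.0 means "all interfaces"; you then
# connect to a concrete address such as 127.0.0.1, never to 0.0.0.0.
import socket

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("0.0.0.0", 0))   # wildcard bind, OS-assigned port
server.listen(1)
port = server.getsockname()[1]

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))  # reach the server via loopback
conn, addr = server.accept()
print("connected from", addr[0])     # prints: connected from 127.0.0.1

conn.close()
client.close()
server.close()
```

This is why Gradio's "Running on local URL: http://0.0.0.0:7860" message can be misleading when the app was launched with `server_name="0.0.0.0"`: the server is fine, but in the browser you have to substitute a real address, e.g. http://127.0.0.1:7860 locally or the machine's LAN IP from another device.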

scottonly2 · May 09 '24 03:05