ComfyUI-Prompt-MZ
Prompt-related nodes based on llama.cpp, currently including prompt beautification and clip-interrogator-style image-to-prompt interrogation.
Recent changes
- [2024-05-30] Add ImageCaptionerConfig node to support batch generation of prompts
- [2024-05-24] Display the generated prompt in the node after running
- [2024-05-24] Compatible with the Zhipu (智谱) API
- [2024-05-24] Use A1111 weight scaling, thanks to ComfyUI_ADV_CLIP_emb
- [2024-05-13] Add OpenAI API node
- [2024-04-30] Support custom instructions
- [2024-04-30] Add llava-v1.6-vicuna-13b
- [2024-04-30] Add translation
- [2024-04-28] Add Phi-3-mini node
Installation
- Clone this repo into the custom_nodes folder.
- Restart ComfyUI.
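The steps above can be run from a shell. This is a sketch: the repository URL is inferred from the project and contact names, and the ComfyUI path is a placeholder you should adjust to your install.

```shell
# Go to ComfyUI's custom_nodes folder (path is an assumption; adjust to your install)
cd /path/to/ComfyUI/custom_nodes

# Clone this repo (URL inferred from the project name; verify before use)
git clone https://github.com/MinusZoneAI/ComfyUI-Prompt-MZ.git

# Restart ComfyUI afterwards so the new nodes are registered
```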
Nodes
- CLIPTextEncode (LLamaCPP Universal)
- ModelConfigManualSelect(LLamaCPP)
- ModelConfigDownloaderSelect(LLamaCPP)
- CLIPTextEncode (ImageInterrogator)
- ModelConfigManualSelect(ImageInterrogator)
- ModelConfigDownloaderSelect(ImageInterrogator)
- CLIPTextEncode (OpenAI API)
- CLIPTextEncode (Phi-3)
- CLIPTextEncode (LLama3)
- ImageInterrogator (LLava)
  - Enable parameter sd_format
- ImageCaptionerConfig
- LLamaCPPOptions
- CustomizeInstruct
- BaseLLamaCPPCLIPTextEncode (you can pass in the model path directly)
- BaseLLavaImageInterrogator (you can pass in the model path directly)
FAQ
module 'llama_cpp' has no attribute 'LLAMA_SPLIT_MODE_LAYER'
Upgrade llama_cpp_python to the latest version: download and install a build from https://github.com/abetlen/llama-cpp-python/releases
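One way to check the installed version and upgrade from PyPI (the package name there is llama-cpp-python; without a matching prebuilt wheel, pip will build it from source):

```shell
# Show the currently installed version of the binding
pip show llama-cpp-python

# Upgrade to the latest release on PyPI
pip install --upgrade llama-cpp-python
```

If the source build fails, use one of the prebuilt wheels from the releases page linked above instead.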
Failed to load shared library LLama.dll
Switch your CUDA version to 12.1. If you use the Aki (秋叶) launcher: Advanced Settings -> Environment Maintenance -> Install PyTorch -> select the CUDA 12.1 build.
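Outside the launcher, the CUDA 12.1 build of PyTorch can be installed with pip from PyTorch's official wheel index:

```shell
# Install the CUDA 12.1 build of PyTorch from the official wheel index
pip install torch --index-url https://download.pytorch.org/whl/cu121
```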
...llama_cpp_python-0.2.63-cp310-cp310-win_amd64.whl returned non-zero exit status
Keep your network connection stable (use a proxy if necessary), or install llama_cpp_python manually.
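A manual install looks like this, using the wheel file name from the error above as an example; download the wheel matching your Python version and platform from the llama-cpp-python releases page first:

```shell
# Install a locally downloaded wheel instead of fetching it during the build
# (file name is an example; use the one matching your Python version and OS)
pip install llama_cpp_python-0.2.63-cp310-cp310-win_amd64.whl
```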
Credits
- https://github.com/comfyanonymous/ComfyUI
- https://github.com/ggerganov/llama.cpp
- https://github.com/BlenderNeko/ComfyUI_ADV_CLIP_emb
Star History
Contact
- WeChat (绿泡泡): minrszone
- Bilibili: minus_zone
- Xiaohongshu (小红书): MinusZoneAI
- Afdian (爱发电): MinusZoneAI