ComfyUI_VLM_nodes

Is it possible to combine both branches (main + mac) into one?

bigcat88 opened this issue 1 year ago • 2 comments

This would greatly simplify the use of these wonderful Nodes.

Whether a node is running on a Mac or not is very easy to determine. Whether the Python package for AMD is installed is also a 2-line check.
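For example (a minimal sketch; auto_gptq stands in here for whatever optional, platform-specific dependency needs checking):

```python
import importlib.util
import platform

# True when the nodes are running on macOS.
IS_MAC = platform.system() == "Darwin"

# True when an optional, platform-specific dependency is importable
# (auto_gptq is only an example of such a package).
HAS_AUTOGPTQ = importlib.util.find_spec("auto_gptq") is not None
```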

Or am I missing something obvious?

bigcat88 • May 22 '24 16:05

The autogptq library doesn't work on Mac devices, and the InternLM model required it; that's why I separated the branches. There was also a problem with the pycpuinfo package, which needs a lower version on Windows and Linux to load faster and a newer version on Mac. I can delete the InternLM node and the autogptq dependency, since a lot of VLM models have been released after that one. I think we can combine them.
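A combined branch could also gate the InternLM node behind an availability check rather than dropping it outright. A minimal sketch, assuming ComfyUI's standard NODE_CLASS_MAPPINGS convention (the module path and class name are hypothetical):

```python
import importlib.util

NODE_CLASS_MAPPINGS = {}
NODE_DISPLAY_NAME_MAPPINGS = {}

# Only register the InternLM node when its optional dependency is importable,
# so the same branch can load on macOS without auto_gptq installed.
if importlib.util.find_spec("auto_gptq") is not None:
    from .internlm_node import InternLMNode  # hypothetical module and class name
    NODE_CLASS_MAPPINGS["InternLMNode"] = InternLMNode
    NODE_DISPLAY_NAME_MAPPINGS["InternLMNode"] = "InternLM VLM"
```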

gokayfem • May 22 '24 17:05

There was also a problem with the pycpuinfo package, which needs a lower version on Windows and Linux to load faster and a newer version on Mac.

This can be handled in requirements.txt (I mean, specify separate versions of pycpuinfo for Mac and for Windows/Linux).
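For example, with PEP 508 environment markers in requirements.txt (assuming the PyPI name py-cpuinfo; the version pins below are placeholders, not the versions the repo actually needs):

```
# Placeholder pins, shown only to illustrate environment markers:
py-cpuinfo==8.0.0; sys_platform != "darwin"
py-cpuinfo>=9.0.0; sys_platform == "darwin"
```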

I can delete the InternLM node and the autogptq dependency, since a lot of VLM models have been released after that one. I think we can combine them.

That would be awesome, as there aren't many working local VLM nodes apart from this repo :)

bigcat88 • May 22 '24 17:05