
Does it support Ollama?

Open Philmar1986 opened this issue 1 year ago • 4 comments

Does it support Ollama?

Philmar1986 avatar May 19 '24 05:05 Philmar1986

I haven't tested it with Ollama, but from their documentation it looks like it might work (I assume you're talking about Advanced Prompt Enhancer).
Try using this URL: http://localhost:11434/api (or a different port if you've set up something other than 11434) with either the http POST or http POST Simplified Data AI_Services modes. No guarantees, but worth a try.
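For context, an "http POST" mode like the one suggested above presumably sends an OpenAI-style chat-completion body as JSON. Here is a minimal stdlib sketch of that request; the URL, the model name `llama3`, and the helper names are illustrative assumptions, not part of the Plush node:

```python
import json
import urllib.request

def build_chat_payload(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion request body.

    Field names follow the OpenAI chat API ("model", "messages",
    "stream"); an OpenAI-compatible POST mode would send something
    shaped like this.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one JSON object, not a stream of chunks
    }

def post_chat(url: str, payload: dict, timeout: float = 60.0) -> dict:
    """POST the payload as JSON to 'url' and decode the JSON reply.

    'url' is assumed to be an endpoint such as
    http://localhost:11434/api/chat on a running Ollama server.
    """
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))

# Example (requires a running Ollama server; model name is hypothetical):
# reply = post_chat("http://localhost:11434/api/chat",
#                   build_chat_payload("llama3", "Describe a sunset."))
```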

glibsonoran avatar May 19 '24 05:05 glibsonoran

I just tried it as you said; it shows the following:

Error occurred when executing AdvPromptEnhancer:

string indices must be integers

File "K:\ComfyUI-aki-v1.3\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "K:\ComfyUI-aki-v1.3\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "K:\ComfyUI-aki-v1.3\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "K:\ComfyUI-aki-v1.3\custom_nodes\Plush-for-ComfyUI\style_prompt.py", line 696, in gogo
    llm_result = self.ctx.execute_request(**kwargs)
File "K:\ComfyUI-aki-v1.3\custom_nodes\Plush-for-ComfyUI\api_requests.py", line 721, in execute_request
    return self._request.request_completion(**kwargs)
File "K:\ComfyUI-aki-v1.3\custom_nodes\Plush-for-ComfyUI\api_requests.py", line 191, in request_completion
    self.j_mngr.log_events(f"Server STATUS error {e.status_code}: {e.body['message'] if e.body else ''}. File may be too large.",

Philmar1986 avatar May 19 '24 07:05 Philmar1986

That error was from the Local App (URL) AI_Service. Did you try both: OpenAI compatible http POST and http POST Simplified Data?
I don't know if this will actually work; my node is set up to communicate with apps that comply with the OpenAI API standards, and Ollama doesn't seem to do that. But if anything is going to work, it will be one of those two methods.

glibsonoran avatar May 19 '24 16:05 glibsonoran

Looking again at their documentation, you might also try these URLs:

  • http://localhost:11434/api/generate
  • http://localhost:11434/api/chat

In either case, though, they need to be used with the OpenAI compatible http POST and http POST Simplified Data AI_Service methods.
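One plausible cause of the "string indices must be integers" error above is a mismatch between response shapes: an OpenAI-compatible client expects `{"choices": [{"message": {...}}]}`, while Ollama's native `/api/chat` returns `{"message": {"content": ...}}` (and streams newline-delimited JSON unless `"stream": false` is set). A small sketch of a shape-tolerant extractor, as an assumption about where the error comes from rather than a description of the node's actual code:

```python
def extract_text(reply: dict) -> str:
    """Pull the generated text out of a chat reply, whichever shape it has.

    OpenAI-style:                     {"choices": [{"message": {"content": "..."}}]}
    Ollama /api/chat (non-streaming): {"message": {"content": "..."}}

    Indexing the wrong shape (e.g. treating a string value as a dict
    of responses) is exactly the kind of mistake that raises
    "string indices must be integers" in Python.
    """
    if "choices" in reply:
        return reply["choices"][0]["message"]["content"]
    if "message" in reply:
        return reply["message"]["content"]
    raise ValueError(f"Unrecognized reply shape: {list(reply)}")
```

If a shape mismatch is the issue, pointing the node at an OpenAI-compatible endpoint (newer Ollama builds expose one under `/v1`) rather than the native API may avoid it.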

glibsonoran avatar May 19 '24 16:05 glibsonoran

Did it work for you Philmar?

glibsonoran avatar May 23 '24 21:05 glibsonoran

CLOSED

glibsonoran avatar May 26 '24 17:05 glibsonoran