
Llava chat with ollama not working

Open · wrapss opened this issue 11 months ago · 4 comments

Self Checks

  • [X] I have searched for existing issues, including closed ones.
  • [X] I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • [X] Please do not modify this template :) and fill in all the required fields.

Dify version

0.5.8

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

I followed your tutorial exactly: https://docs.dify.ai/tutorials/model-configuration/ollama

My chat mode settings: [screenshot]

My completion mode settings: [screenshot]

✔️ Expected Behavior

When I use completion mode: [screenshot]

❌ Actual Behavior

When I use chat mode: [screenshot]

I tried increasing the model context size and changing other parameters, but it didn't change anything.

wrapss · Mar 07 '24 16:03

Hey @wrapss, great to see you back here! Hope you're doing well.

Based on the information you've provided, it seems like you're having trouble with chat mode in Dify version 0.5.8. I understand that you've followed the tutorial and set up your model configuration as instructed, but you're not getting the expected behavior.

From the context provided, it appears that the chat mode and completion mode in Dify have different requirements and functionalities. The chat mode is specifically designed for conversational interactions requiring dialogue management, while the completion mode is focused on completing tasks or generating content based on inputs. Each mode has tailored input validation and error handling to suit its specific use case.

However, without the exact error message or behavior observed when trying to use the chat mode, it's challenging to pinpoint the exact issue. It would be helpful if you could provide more details about the problem you're experiencing. For instance, are you receiving any error messages? If so, what do they say? Or is the chat mode not responding at all?

In the meantime, you might want to double-check your model configuration against the requirements for chat mode in Dify version 0.5.8. For instance, the mode should be set to chat, the context_size should be 16384, and the max_tokens parameter should be between 1 and 16384, among other settings.
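If it helps to narrow things down, you can also check whether llava answers in chat mode at the Ollama level, independently of Dify, by calling Ollama's `/api/chat` endpoint directly. The sketch below is only illustrative: the host, port, model tag, and `num_ctx` value are assumptions to adjust to your setup.

```python
# Minimal sketch: ask llava a question through Ollama's chat endpoint, bypassing Dify.
# Assumptions: Ollama listens on http://localhost:11434 and the "llava" tag is pulled.
import requests

OLLAMA_URL = "http://localhost:11434"
payload = {
    "model": "llava",
    "messages": [{"role": "user", "content": "Describe a cat in one sentence."}],
    "stream": False,
    "options": {"num_ctx": 4096},  # roughly mirrors Dify's "model context size" setting
}

resp = requests.post(f"{OLLAMA_URL}/api/chat", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```

If this call succeeds but chat mode in Dify still fails, the problem is more likely in the Dify model configuration than in Ollama itself.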

If you've already checked these settings and are still experiencing issues, please provide more details about the problem, and I'll do my best to assist you further.

About Dosu

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.

If you want to continue the conversation, start your reply with @dosu-bot.

dosubot[bot] · Mar 07 '24 16:03

The problem is still present with llava:34b-v1.6, despite using an even larger context size.

wrapss · Mar 07 '24 16:03

Maybe try reducing the Num Predict model parameter a bit?
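For reference, the Num Predict parameter in Dify's Ollama settings appears to correspond to Ollama's `num_predict` option, which caps how many tokens the model may generate per response. A small sketch of the same effect in a direct Ollama call (the URL, model tag, and the value 256 are arbitrary assumptions):

```python
# Sketch: cap the generated output with num_predict on a direct Ollama chat call.
import requests

payload = {
    "model": "llava",  # assumption: your local model tag
    "messages": [{"role": "user", "content": "Say hello in one short sentence."}],
    "stream": False,
    "options": {"num_ctx": 4096, "num_predict": 256},  # 256 is just an example cap
}
resp = requests.post("http://localhost:11434/api/chat", json=payload, timeout=120)
print(resp.json()["message"]["content"])
```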

takatost · Mar 08 '24 10:03

Hello everyone, I have a problem in my environment: as you can see in the screenshot, the information is not loading on the screen. I'm using a Mac and deployed with Docker: I cloned the repo, went into the /docker folder, and ran the docker compose up -d command.

Any suggestions?

[Screenshots taken 2024-03-16 at 10:17:20, 10:17:42, and 10:19:47]

Regarding Ollama: I installed it, it is running, and I was able to add it to Dify without any problems using my network IP and the Ollama port.

[Screenshot 2024-03-16 at 10:19:08: Ollama added successfully]

Also, the option is not loading. [Screenshot 2024-03-16 at 10:27:08]

The error message I got when I tried to save: "An error occurred during credentials validation: Invalid URL 'api/generate': No scheme supplied. Perhaps you meant https://api/generate?"
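That message is the standard error the Python `requests` library raises when a URL has no scheme: the Base URL field ended up as a bare path, so the validation request to `api/generate` could not be sent. Below is a small sketch of the failure and of a Base URL that does work; `host.docker.internal:11434` is the usual way a Dify container on Docker Desktop for Mac reaches an Ollama server on the host, but treat it as an assumption and substitute your host's LAN IP if that name does not resolve from inside the container.

```python
# Sketch: reproduce the "No scheme supplied" validation error, then use a full Base URL.
import requests

try:
    # A bare path is rejected before any request is made.
    requests.post("api/generate", json={"model": "llava", "prompt": "hi"})
except requests.exceptions.MissingSchema as err:
    print(err)  # Invalid URL 'api/generate': No scheme supplied. ...

# A complete Base URL includes scheme, host, and port.
# host.docker.internal is an assumption for Dify running in Docker on macOS.
base_url = "http://host.docker.internal:11434"
resp = requests.post(
    f"{base_url}/api/generate",
    json={"model": "llava", "prompt": "hi", "stream": False},
    timeout=120,
)
print(resp.json()["response"])
```

In other words, the Base URL entered in Dify's Ollama settings needs the full http://&lt;host&gt;:11434 form, as in the tutorial linked earlier in this thread.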

giovannicocco · Mar 16 '24 13:03