
Abnormalities in multi-round LLM conversations

love01211 opened this issue 10 months ago · 3 comments

Self Checks

  • [x] This is only for bug report, if you would like to ask a question, please head to Discussions.
  • [x] I have searched for existing issues, including closed ones.
  • [x] I confirm that I am using English to submit this report (I have read and agreed to the Language Policy).
  • [x] [FOR CHINESE USERS] Please be sure to submit issues in English, or they will be closed. Thank you! :)
  • [x] Please do not modify this template :) and fill in all the required fields.

Dify version

v0.15.3

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

Launch the DeepSeek-7B model with vLLM, then register it in Dify via "Model Provider" -> "ADD MORE MODEL PROVIDER" -> LocalAI. Configure the workflow to upload a file, then hold a multi-round conversation: the error below occurs. The same workflow works normally when the model is added via "ADD MORE MODEL PROVIDER" -> Ollama instead.

✔️ Expected Behavior

Normal operation

❌ Actual Behavior

The request fails with "User message content must be str" (screenshot attached).
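
A plausible (unconfirmed) reading of this failure: with file upload enabled, a multi-round user turn can be serialized as a list of content parts rather than a plain string, and the LocalAI provider path only accepts the string form. The message shapes below are hypothetical sketches, not payloads captured from Dify:

```python
# Hypothetical message shapes (illustration only, not captured from Dify).

# Plain string content -- the shape the provider path accepts:
ok_message = {"role": "user", "content": "Summarize the uploaded file."}

# List-of-parts content -- the shape a file-bearing turn can take in a
# multi-round conversation, which would trip a "content must be str" check:
failing_message = {
    "role": "user",
    "content": [
        {"type": "text", "text": "Summarize the uploaded file."},
    ],
}
```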

DSL:

app:
  description: ''
  icon: 🤖
  icon_background: '#FFEAD5'
  mode: advanced-chat
  name: 上传文件读取
  use_icon_as_answer_icon: false
kind: app
version: 0.1.5
workflow:
  conversation_variables: []
  environment_variables: []
  features:
    file_upload:
      allowed_file_extensions: []
      allowed_file_types:
      - document
      allowed_file_upload_methods:
      - local_file
      enabled: true
      fileUploadConfig:
        audio_file_size_limit: 50
        batch_count_limit: 5
        file_size_limit: 15
        image_file_size_limit: 10
        video_file_size_limit: 100
        workflow_file_upload_limit: 10
      image:
        enabled: false
        number_limits: 3
        transfer_methods:
        - local_file
        - remote_url
      number_limits: 1
    opening_statement: ''
    retriever_resource:
      enabled: true
    sensitive_word_avoidance:
      enabled: false
    speech_to_text:
      enabled: false
    suggested_questions: []
    suggested_questions_after_answer:
      enabled: false
    text_to_speech:
      enabled: false
      language: ''
      voice: ''
  graph:
    edges:
    - data:
        sourceType: llm
        targetType: answer
      id: llm-answer
      source: llm
      sourceHandle: source
      target: answer
      targetHandle: target
      type: custom
    - data:
        isInIteration: false
        sourceType: start
        targetType: document-extractor
      id: 1738899393364-source-1738922032713-target
      source: '1738899393364'
      sourceHandle: source
      target: '1738922032713'
      targetHandle: target
      type: custom
      zIndex: 0
    - data:
        isInIteration: false
        sourceType: document-extractor
        targetType: llm
      id: 1738922032713-source-llm-target
      source: '1738922032713'
      sourceHandle: source
      target: llm
      targetHandle: target
      type: custom
      zIndex: 0
    nodes:
    - data:
        desc: ''
        selected: false
        title: 开始
        type: start
        variables: []
      height: 53
      id: '1738899393364'
      position:
        x: 30
        y: 251
      positionAbsolute:
        x: 30
        y: 251
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        context:
          enabled: true
          variable_selector:
          - '1738922032713'
          - text
        desc: ''
        memory:
          query_prompt_template: '{{#sys.query#}}'
          role_prefix:
            assistant: ''
            user: ''
          window:
            enabled: false
            size: 10
        model:
          completion_params:
            temperature: 0.7
          mode: chat
          name: DeepSeek-R1-Distill-Qwen-7B
          provider: localai
        prompt_template:
        - id: 4a6623cd-78ba-4a4d-9478-589528dca75c
          role: system
          text: 你是一个文件助手,根据用户需要,结合文件内容回答用户,文件内容在<files>{{#context#}}</files>
        selected: false
        title: LLM
        type: llm
        variables: []
        vision:
          enabled: false
      height: 97
      id: llm
      position:
        x: 638
        y: 251
      positionAbsolute:
        x: 638
        y: 251
      selected: true
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        answer: '{{#llm.text#}}'
        desc: ''
        selected: false
        title: 直接回复
        type: answer
        variables: []
      height: 102
      id: answer
      position:
        x: 942
        y: 251
      positionAbsolute:
        x: 942
        y: 251
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    - data:
        desc: ''
        is_array_file: true
        selected: false
        title: 文档提取器
        type: document-extractor
        variable_selector:
        - sys
        - files
      height: 91
      id: '1738922032713'
      position:
        x: 332.57142857142856
        y: 251
      positionAbsolute:
        x: 332.57142857142856
        y: 251
      selected: false
      sourcePosition: right
      targetPosition: left
      type: custom
      width: 244
    viewport:
      x: 77.86710032646181
      y: 174.9890744049676
      zoom: 0.6093853943072869

love01211 commented Feb 17 '25

@love01211 Hello there! I'm here to help you with bug reports, answer your questions, and assist you in becoming a contributor. Let's work together to solve this issue.

I found a similar unsolved discussion regarding issues with continuous dialogue when uploading files in Dify. You can check it out here [1].

To resolve the "User message content must be str" error when using the DeepSeek-7B model with vLLM in a self-hosted Dify setup, ensure that the content of UserPromptMessage is always a string. In the code, the _convert_prompt_messages_to_dicts method handles the conversion of UserPromptMessage content. When creating a UserPromptMessage, make sure the content attribute is a string; if it is not, convert it to a string before passing it to the model. If the content is a list of PromptMessageContentType items, make sure it is converted to a format the model can process [2].
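
A minimal sketch of that suggestion, assuming OpenAI-style content parts; flatten_user_content is a hypothetical helper, not a function in Dify's codebase:

```python
# Hypothetical helper sketching the suggestion above (not Dify's actual code).
def flatten_user_content(content):
    """Coerce a UserPromptMessage content value into a plain str."""
    if isinstance(content, str):
        return content
    if isinstance(content, list):
        # Keep only the text parts; non-text parts (files, images) have no
        # plain-string form and are dropped in this sketch.
        texts = [
            part.get("text", "")
            for part in content
            if isinstance(part, dict) and part.get("type") == "text"
        ]
        return "\n".join(t for t in texts if t)
    return str(content)

# Example: flatten_user_content([{"type": "text", "text": "hi"}]) -> "hi"
```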

To continue talking to Dosu, mention @dosu.



dosubot[bot] commented Feb 17 '25

Help me.

love01211 commented Feb 24 '25

Hi, @love01211. I'm Dosu, and I'm helping the Dify team manage their backlog. I'm marking this issue as stale.

Issue Summary:

  • You reported a "User message content must be str" error with the DeepSeek-7B model served by vLLM in a self-hosted environment.
  • The issue does not occur with the "ollama" model provider.
  • I suggested ensuring UserPromptMessage content is a string and shared a related discussion link.
  • Both you and another user, yxdfighting, indicated that the suggestion did not resolve the issue.

Next Steps:

  • Please confirm if this issue is still relevant with the latest version of the Dify repository. If so, you can keep the discussion open by commenting here.
  • If there is no further activity, this issue will be automatically closed in 15 days.

Thank you for your understanding and contribution!

dosubot[bot] commented Mar 27 '25