
[Bug] Incomplete response in vision mode (response interrupted)

Open devyujie opened this issue 1 year ago • 20 comments

Bug Description

(screenshot)

Steps to Reproduce

Expected Behavior

Screenshots

No response

Deployment Method

  • [ ] Docker
  • [ ] Vercel
  • [ ] Server

Desktop OS

No response

Desktop Browser

No response

Desktop Browser Version

No response

Smartphone Device

No response

Smartphone OS

No response

Smartphone Browser

No response

Smartphone Browser Version

No response

Additional Logs

No response

devyujie avatar Feb 20 '24 11:02 devyujie


+1

QAbot-zh avatar Feb 21 '24 05:02 QAbot-zh

+1

X-Zero-L avatar Feb 21 '24 08:02 X-Zero-L

+1

boanz avatar Feb 22 '24 02:02 boanz

@devyujie Could you tell me how to use vision mode? Where do I upload an image?

jam-cc avatar Feb 23 '24 15:02 jam-cc


@devyujie Could you tell me how to use vision mode? Where do I upload an image?

Switch to a model that supports vision, such as gpt4v or gemini-vision, and the image-upload icon will appear.

QAbot-zh avatar Feb 23 '24 15:02 QAbot-zh


Same

KyleJKC avatar Feb 24 '24 18:02 KyleJKC

Same. How to fix? I changed max_tokens to 4096, same problem.

DreamsCat avatar Feb 26 '24 03:02 DreamsCat

You need to set max_tokens for gpt-4-vision-preview.

example:

https://hackerchat.btz.sh/

(screenshot)
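As a minimal sketch of the advice above (hypothetical helper, not this repository's actual code): the request payload sent to the chat-completions endpoint must carry an explicit `max_tokens` for vision models, because `gpt-4-vision-preview` otherwise defaults to a very small completion limit.

```typescript
// Hypothetical payload builder. Vision models get an explicit
// max_tokens so the reply is not truncated at the server default;
// other models omit it and let the API decide.
interface ChatPayload {
  model: string;
  messages: { role: string; content: unknown }[];
  max_tokens?: number;
}

function buildVisionPayload(
  model: string,
  messages: { role: string; content: unknown }[],
): ChatPayload {
  const payload: ChatPayload = { model, messages };
  if (model.includes("vision")) {
    payload.max_tokens = 4096;
  }
  return payload;
}
```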

H0llyW00dzZ avatar Feb 26 '24 04:02 H0llyW00dzZ

Same. How to fix? I changed max_tokens to 4096, same problem.

It doesn't work because it's disabled by default in this repository:

https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/blob/main/app/client/platforms/openai.ts#L109

H0llyW00dzZ avatar Feb 26 '24 04:02 H0llyW00dzZ

Anyway, this bug can be easily fixed. However, I don't believe it will be merged into the main branch, since the owner has made changes.

H0llyW00dzZ avatar Feb 26 '24 12:02 H0llyW00dzZ

Anyway, this bug can be easily fixed. However, I don't believe it will be merged into the main branch, since the owner has made changes.

It's really bad; the problem still reproduces even after updating to version 2.11.2.

AndyX-Net avatar Feb 27 '24 01:02 AndyX-Net

Anyway, this bug can be easily fixed. However, I don't believe it will be merged into the main branch, since the owner has made changes.

It's really bad; the problem still reproduces even after updating to version 2.11.2.

Yes, I understand that there's nothing particularly remarkable about the latest version. It would be more beneficial to focus on bug fixes and performance improvements, rather than adding another AI that may not be entirely stable for everyone.

H0llyW00dzZ avatar Feb 27 '24 01:02 H0llyW00dzZ

You need to set max_tokens for gpt-4-vision-preview.

example:

https://hackerchat.btz.sh/

(screenshot)

Thanks, but I don't see the max_tokens option...

DreamsCat avatar Feb 27 '24 01:02 DreamsCat

Same. How to fix? I changed max_tokens to 4096, same problem.

It doesn't work because it's disabled by default in this repository:

https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/blob/main/app/client/platforms/openai.ts#L109

OK, got it.

DreamsCat avatar Feb 27 '24 01:02 DreamsCat

Anyway, this bug can be easily fixed. However, I don't believe it will be merged into the main branch, since the owner has made changes.

It's really bad; the problem still reproduces even after updating to version 2.11.2.

Yes, I understand that there's nothing particularly remarkable about the latest version. It would be more beneficial to focus on bug fixes and performance improvements, rather than adding another AI that may not be entirely stable for everyone.

Agree with your point : )

AndyX-Net avatar Feb 27 '24 01:02 AndyX-Net

Same. How to fix? I changed max_tokens to 4096, same problem.

It doesn't work because it's disabled by default in this repository:

https://github.com/ChatGPTNextWeb/ChatGPT-Next-Web/blob/main/app/client/platforms/openai.ts#L109

Currently gpt-4-vision has a very low default max_tokens value, which makes replies very short and incomplete. Uncommenting that line and rebuilding from source will pass the max_tokens value again, overriding the default and solving the problem.

KSnow616 avatar Feb 27 '24 07:02 KSnow616

To minimize the impact, max_tokens is currently configured separately for vision models only. If you encounter additional problems, please feel free to give feedback.
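A sketch of what "configured separately for vision models" could look like (hypothetical names; the repository's real implementation may differ): detect vision-capable models by name, and only then attach an explicit max_tokens.

```typescript
// Hypothetical vision-model gate. Model-name keywords are assumptions,
// not an exhaustive list from the repository.
const VISION_KEYWORDS = ["vision", "gpt-4v", "gemini-pro-vision"];

function isVisionModel(model: string): boolean {
  return VISION_KEYWORDS.some((k) => model.toLowerCase().includes(k));
}

function maxTokensFor(model: string, configured: number): number | undefined {
  // Non-vision models return undefined so max_tokens is omitted from the
  // payload and the API default applies; vision models send it explicitly,
  // raised to a floor so replies are not truncated.
  return isVisionModel(model) ? Math.max(configured, 4000) : undefined;
}
```

This keeps the earlier behavior for text-only models while fixing truncation for vision models, which matches the "minimize the impact" intent.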

fred-bf avatar Feb 27 '24 09:02 fred-bf

There is a problem: when the image is too large, an error is reported. Could the image be automatically compressed after upload?

induite avatar Mar 01 '24 09:03 induite
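Automatic compression as asked for above could start from pure size math like the sketch below (hypothetical, not the repository's code); in the browser, the computed size would then be applied by drawing the image onto a `<canvas>` at the reduced dimensions and re-encoding it with `canvas.toDataURL("image/jpeg")` before sending.

```typescript
// Hypothetical downscale math: shrink so the longest side is at most
// maxSide, preserving aspect ratio, and never upscale small images.
function fitWithin(
  width: number,
  height: number,
  maxSide: number,
): { width: number; height: number } {
  const scale = Math.min(1, maxSide / Math.max(width, height));
  return {
    width: Math.round(width * scale),
    height: Math.round(height * scale),
  };
}
```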
