
gpt-4-vision-preview does not work as expected.

iterprise opened this issue 1 year ago • 7 comments

Confirm this is a Node library issue and not an underlying OpenAI API issue

  • [X] This is an issue with the Node library

Describe the bug

The response from ChatGPT unexpectedly cuts off when using stream. The response via the API does not match the response through the chat UI; through the API I only receive the beginning of the response, which unexpectedly cuts off. I think this is related to the bug below: https://github.com/openai/openai-node/issues/499

To Reproduce

I call openai.beta.chat.completions.stream with an image_url, using the image attached below (1.png).

From the API I got only "The instructions are asking for a modification of the SQL CREATE TABLE statement for". From the chat UI I got much more.

Code snippets

const testVision = async () => {
    const stream = await openai.beta.chat.completions.stream({
        model: 'gpt-4-vision-preview',
        messages: [
            {
                role: 'user',
                content: [{
                    type: 'image_url',
                    image_url: convertImageToDataURLSync('1.png'),
                }],
            }
        ],
        stream: true,
    });
    stream.on('content', (delta, snapshot) => {
        process.stdout.write(delta)
    });
    stream.finalChatCompletion().then( () => {
        process.stdout.write('\n')
    } );
}

OS

Linux

Node version

Node v18.16.0

Library version

openai 4.22.0

iterprise • Dec 15 '23 21:12

It doesn't work without streaming either.

iterprise • Dec 15 '23 21:12

Hmm, it may be that your program is exiting because you're not waiting for the stream to complete. Try this:

const testVision = async () => {
    const stream = await openai.beta.chat.completions.stream({
        model: 'gpt-4-vision-preview',
        messages: [
            {
                role: 'user',
                content: [{
                    type: 'image_url',
                    image_url: convertImageToDataURLSync('1.png'),
                }],
            }
        ],
        stream: true,
    });
    stream.on('content', (delta, snapshot) => {
        process.stdout.write(delta)
    });
    
    await stream.finalChatCompletion();
    console.log();
}

Does that help?

rattrayalex • Dec 16 '23 03:12

I tried this one with the same result.

const testVision = async () => {
    const stream = await openai.chat.completions.create({
        model: 'gpt-4-vision-preview',
        messages: [
            {
                role: 'user',
                content: [{
                    type: 'image_url',
                    image_url: convertImageToDataURLSync('1.png'),
                }],
            }
        ],
    });
    console.log('Not a stream', stream.choices[0].message.content);
}

I also tested your version of my code, with the same result.

iterprise • Dec 16 '23 03:12

cc @logankilpatrick


rattrayalex • Dec 16 '23 04:12

@rattrayalex Why did you remove the bug label? Is there some problem with my code?

iterprise • Dec 16 '23 16:12

It's probably a problem with the underlying API, not a bug in the Node SDK.

rattrayalex • Dec 19 '23 03:12

Hi @iterprise, thanks for the report. We can confirm this is an issue in the API and are working on addressing it. As a workaround, you can manually set the value of max_tokens to a higher value (it is currently defaulting to 16 when it should not, which is why the responses are getting cut off).

athyuttamre • Dec 19 '23 07:12
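The workaround described above (explicitly setting max_tokens so the API does not fall back to its unintended default of 16) can be sketched as follows; the helper name buildVisionRequest and the 4096 value are illustrative, not part of the thread:

```javascript
// Sketch of the workaround: build a chat-completions request body with an
// explicit max_tokens, so responses are not truncated at the buggy default.
function buildVisionRequest(imageDataUrl, maxTokens = 4096) {
    return {
        model: 'gpt-4-vision-preview',
        max_tokens: maxTokens, // workaround: override the unintended default of 16
        messages: [
            {
                role: 'user',
                content: [{ type: 'image_url', image_url: imageDataUrl }],
            },
        ],
    };
}

// Usage: pass the body to openai.chat.completions.create(...)
// const completion = await openai.chat.completions.create(buildVisionRequest(dataUrl));
```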