
functions?

Open mrahmadt opened this issue 2 years ago • 5 comments

Thank you very much for such a nice and simple-to-use package.

I'm trying to test whether I can use OpenAI functions, but I'm not sure if the script will keep the function messages when it truncates the conversation.

Will it ever truncate functions, or are they always preserved?

mrahmadt avatar Sep 04 '23 22:09 mrahmadt

All the wrappers forward everything, so functions should work just fine. Let me know if you see otherwise.

mrsteele avatar Sep 05 '23 13:09 mrsteele

After further testing, I think we need a way to remove functions from the history :)

They eat up tokens, and openai-tokens will not remove them:

openai-tokens[truncate]: Unable to truncate any further. Prompts too large. Returning unresolvable.
Error fetching data: BadRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4534 tokens (4468 in the messages, 66 in the functions). Please reduce the length of the messages or functions.
    at Function.generate (file:///Users/shadow/Documents/dev/node_modules/openai/src/error.ts:59:6)
    at OpenAI.makeStatusError (file:///Users/shadow/Documents/dev/node_modules/openai/src/core.ts:381:13)
    at OpenAI.makeRequest (file:///Users/shadow/Documents/dev/node_modules/openai/src/core.ts:442:15)
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at OpenAIClass.processMessage (/Users/shadow/Documents/dev/kernel/providers/openai.ts:208:50)
    at OpenAIClass.sendMessage (/Users/shadow/Documents/dev/kernel/providers/openai.ts:74:17)
    at Module.incomingMessage (/Users/shadow/Documents/dev/kernel/handlers/incomingMessage.ts:70:24)

mrahmadt avatar Sep 06 '23 14:09 mrahmadt

Huh, it would be nice to support this. We just need to add the function blocks to the token calculations, right?
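For illustration, something along these lines might work. Note that `estimateTokens`, `countRequestTokens`, and the per-message overhead below are my own rough assumptions for this sketch, not the library's actual API or OpenAI's documented accounting:

```javascript
// Rough sketch: include function definitions in the token estimate.
// estimateTokens is a crude chars/4 heuristic standing in for a real
// tokenizer (e.g. tiktoken); the exact overhead OpenAI adds per function
// is undocumented, so treat these numbers as approximations only.
const estimateTokens = (text) => Math.ceil(text.length / 4)

const countRequestTokens = ({ messages = [], functions = [] }) => {
  const messageTokens = messages.reduce(
    (sum, m) => sum + estimateTokens(m.content || '') + 4, // ~4 tokens per-message overhead (assumed)
    0
  )
  // Function definitions are serialized into the prompt, so count them too.
  const functionTokens = functions.reduce(
    (sum, f) => sum + estimateTokens(JSON.stringify(f)),
    0
  )
  return messageTokens + functionTokens
}

const total = countRequestTokens({
  messages: [{ role: 'user', content: 'What is the weather in Paris?' }],
  functions: [{
    name: 'get_weather',
    description: 'Get the current weather for a city',
    parameters: { type: 'object', properties: { city: { type: 'string' } } }
  }]
})
console.log(total)
```

The key point is just that `functions` has to be part of whatever total the truncation logic compares against the model's context limit, otherwise the error above can still occur after truncation.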

mrsteele avatar Sep 06 '23 22:09 mrsteele

> Huh, it would be nice to support this. We just need to add the function blocks to the token calculations, right?

Yes, correct.

Usually it's a large block of text resulting from calling a function as requested by OpenAI.

I think once OpenAI has read the function result and responded with an assistant message, the function message is in most cases no longer needed.

But I would make this function truncation optional; other use cases may need the functions kept all the time.

For me, I just remove it immediately after OpenAI responds, regardless of the size, because in my case I don't need it.
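For what it's worth, the pruning I describe is just a filter over the history once the assistant's final reply arrives. The message shapes below follow the Chat Completions API, but `pruneFunctionMessages` is my own helper for this sketch, not something openai-tokens provides:

```javascript
// Sketch: drop function-call traffic from the history once the assistant
// has produced its final answer, keeping only user/assistant/system turns.
const pruneFunctionMessages = (messages) =>
  messages.filter((m) =>
    m.role !== 'function' &&                       // function results
    !(m.role === 'assistant' && m.function_call)   // the call request itself
  )

const history = [
  { role: 'user', content: 'Weather in Paris?' },
  { role: 'assistant', content: null, function_call: { name: 'get_weather', arguments: '{"city":"Paris"}' } },
  { role: 'function', name: 'get_weather', content: '{"temp": 21}' },
  { role: 'assistant', content: 'It is 21°C in Paris.' }
]

console.log(pruneFunctionMessages(history).length) // 2: only the user turn and the final answer remain
```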

mrahmadt avatar Sep 07 '23 06:09 mrahmadt

Interesting. I looked up the docs, and functions are passed as some sort of system prompt trained to interpret them, but I'm not sure about the exact token counting.

I would love to support it, but the docs on token counting for functions are too opaque, unless you can find something…

mrsteele avatar Sep 09 '23 16:09 mrsteele