openai-node
Is it possible to save the chat, just like ChatGPT does?
Describe the feature or improvement you're requesting
I like to use the API to ask questions. As you know, ChatGPT saves the chat conversation, so the next time you ask a question in the same chat, it answers based on the earlier conversation. But the API, used through this openai library, does not seem to save the chat.
Here is my test:
Here is what ChatGPT did:
Additional context
I don't know why it does not save the chat the way ChatGPT does. Is it because I'm on a free account, or something else? I'm happy to pay for an API that saves the chat and responds just like ChatGPT does. I need help. Thanks.
With GPT-3 you will always be limited to 4k tokens per prompt. You could summarize previous answers and add the summary to the prompt.
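The summarize-and-prepend idea above can be sketched roughly as follows. This is only an illustration: in practice the summary would come from a separate completion call asking the model to condense the conversation, whereas here a naive stand-in (keeping the first sentence of each previous answer) and a rough chars-per-token estimate are assumptions, not what the library provides.

```typescript
// Rough token estimate (≈4 characters per token); a real implementation
// would use a proper tokenizer such as tiktoken.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

// Naive stand-in for a model-generated summary: keep only the first
// sentence of each previous answer.
function naiveSummarize(answers: string[]): string {
  return answers
    .map((a) => a.split(/(?<=[.!?])\s/)[0])
    .join(" ");
}

// Build a prompt that carries condensed context while staying under
// the token budget. If even the summary is too long, trim it from the
// front so the most recent context survives.
function buildPrompt(
  previousAnswers: string[],
  question: string,
  budget = 4000
): string {
  let summary = naiveSummarize(previousAnswers);
  while (
    summary.length > 0 &&
    estimateTokens(summary) + estimateTokens(question) > budget
  ) {
    summary = summary.slice(Math.floor(summary.length / 4));
  }
  return `Context so far: ${summary}\n\nQuestion: ${question}`;
}
```

The resulting string would then be sent as the `prompt` of a normal completion request; the function names here are hypothetical, not part of the openai package.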
@ezzcodeezzlife Thanks. In my business there is a lot of information, and the API needs to memorize it. In this case 4k tokens is too small. Is there any way to obtain more tokens? I can pay for that, but I searched the OpenAI docs and found no way to pay for more tokens.
Same requirements and pain points on my side. And I can't even give it 4k tokens; it fails if I give it more than 2k tokens.
Did you find a way to make it remember anything?
I had the same problem.
I solved this problem. ChatGPT sends the conversation record together with the new message, so you can send the context information along with the prompt. For an example, see https://platform.openai.com/examples/default-chat. But I still have a problem: how do I handle the OpenAI API when max_tokens is exceeded?
Simply put, you send all the records of the current session together with each request.
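A minimal sketch of that pattern: keep every turn in a local array and resend the whole array on each request, dropping the oldest turns when the estimated size exceeds the context budget. The `ChatHistory` class and the ≈4-characters-per-token estimate are assumptions for illustration, not part of the openai package.

```typescript
type Role = "system" | "user" | "assistant";

interface ChatMessage {
  role: Role;
  content: string;
}

// Rough token estimate (≈4 characters per token); use a real tokenizer
// for production.
const estimateTokens = (text: string): number => Math.ceil(text.length / 4);

class ChatHistory {
  messages: ChatMessage[] = [];

  constructor(private maxTokens: number) {}

  // Record a turn, then trim so the next request still fits the budget.
  add(role: Role, content: string): void {
    this.messages.push({ role, content });
    this.trim();
  }

  totalTokens(): number {
    return this.messages.reduce(
      (sum, m) => sum + estimateTokens(m.content),
      0
    );
  }

  // Drop the oldest turns until the history fits the token budget,
  // keeping at least the most recent message.
  private trim(): void {
    while (this.totalTokens() > this.maxTokens && this.messages.length > 1) {
      this.messages.shift();
    }
  }
}
```

On each turn you would pass `history.messages` to the API (as the `messages` array for chat models, or concatenated into the prompt for completion models), then `add` the model's reply so the next request carries it too.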
For anyone interested in this problem, I just published an npm library to solve it: https://www.npmjs.com/package/gpt3_plus. @xhacker5000 @NiKlaus-K
Some tips for how to accomplish this are available in this thread: https://github.com/openai/openai-node/issues/149
We may add convenience methods to help with this in the future.