Conversation support
Was able to get Conversation working with some trial and error.
Added a new class called `Conversation` and changed some opts of `sendMessage`. This may not be the best design, so suggestions are welcome.
Result:
(see src/demo-conversation.ts)
Before:

````ts
import dotenv from 'dotenv-safe'
import { oraPromise } from 'ora'

import { ChatGPTAPI } from '.'

dotenv.config()

/**
 * Example CLI for testing functionality.
 *
 * ```
 * npx tsx src/demo.ts
 * ```
 */
async function main() {
  const api = new ChatGPTAPI({ sessionToken: process.env.SESSION_TOKEN })
  await api.ensureAuth()

  const prompt = 'What is OpenAI?'
  const response = await oraPromise(
    api.sendMessage(prompt, {
      conversationId: '0c382256-d267-4dd4-90e3-a01dd22c20a1'
    }),
    { text: prompt }
  )
  console.log(response)

  const prompt2 = 'Continue'
  console.log(
    await oraPromise(
      api.sendMessage(prompt2, {
        conversationId: '0c382256-d267-4dd4-90e3-a01dd22c20a1'
      }),
      { text: prompt2 }
    )
  )

  console.log(
    await oraPromise(
      api.sendMessage(prompt2, {
        conversationId: '0c382256-d267-4dd4-90e3-a01dd22c20a1'
      }),
      { text: prompt2 }
    )
  )
}

main().catch((err) => {
  console.error(err)
  process.exit(1)
})
````
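For context, here is a minimal sketch of what a `Conversation` wrapper along these lines might look like. This is not the PR's actual implementation (that's in the diff); `SendFn` and the returned field names are assumptions for illustration.

```typescript
// Shape of the underlying send function (assumed for this sketch); in the
// real library this would be ChatGPTAPI's sendMessage.
type SendFn = (
  prompt: string,
  opts: { conversationId?: string; parentMessageId?: string }
) => Promise<{ response: string; conversationId: string; messageId: string }>

class Conversation {
  conversationId?: string
  parentMessageId?: string

  constructor(private send: SendFn) {}

  // Threads the conversation/parent ids through each call so that follow-up
  // prompts continue the same ChatGPT conversation.
  async sendMessage(prompt: string): Promise<string> {
    const res = await this.send(prompt, {
      conversationId: this.conversationId,
      parentMessageId: this.parentMessageId
    })
    this.conversationId = res.conversationId
    this.parentMessageId = res.messageId
    return res.response
  }
}
```

With a wrapper like this, the demo above reduces to creating one `Conversation` and calling `sendMessage` on it repeatedly, with no hard-coded ids.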
This would fix #29
@simon300000 Thanks so much for the PR! 🙏
Give me a bit of time to review and consider different possible APIs.
I consistently (as in, every single time) get 503s and 429s when passing a conversation and parent message ID. Quite strange.
> I consistently (as in, every single time) get 503s and 429s when passing a conversation and parent message ID. Quite strange.
I get 503s and 504s sometimes; maybe @transitive-bullshit could mark this as experimental?
Hmmm, I'm using the `conversationId` and `parentMessageId` in https://github.com/transitive-bullshit/chatgpt-twitter-bot and it is working consistently, though I'm not using the `getConversation` wrapper; rather I'm using `onConversationResponse`.
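The `onConversationResponse` approach boils down to capturing the ids from each response event and passing them back on the next call. A rough sketch follows; the event field names (`conversation_id`, `message.id`) are assumptions based on the ChatGPT wire format, not a verified part of the library's API.

```typescript
// Minimal shape of a conversation response event (assumed for this sketch).
interface ConversationResponseEvent {
  conversation_id?: string
  message?: { id: string }
}

// Tracks the latest ids so they can be fed into subsequent sendMessage calls.
function makeIdTracker() {
  let conversationId: string | undefined
  let parentMessageId: string | undefined
  return {
    // Pass this as the onConversationResponse callback.
    onConversationResponse(res: ConversationResponseEvent) {
      if (res.conversation_id) conversationId = res.conversation_id
      if (res.message?.id) parentMessageId = res.message.id
    },
    // Spread into the next call's opts to continue the conversation.
    get opts() {
      return { conversationId, parentMessageId }
    }
  }
}
```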
@deanylev @simon300000 I'm guessing the issue you're running into is that your client has been flagged as calling the API too fast.
I recommend adding a delay in-between requests and not having more than one request open at a time. Example: https://github.com/transitive-bullshit/chatgpt-twitter-bot/blob/9dde16f1df611d9b7968832910a2174c92e7524e/src/respond-to-new-mentions.ts#L157-L158
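The recommendation amounts to serializing requests with a pause between them. A minimal sketch, assuming a generic `send` function (the 1000 ms default is arbitrary, not a documented limit):

```typescript
// Resolve after ms milliseconds.
const sleep = (ms: number) =>
  new Promise<void>((resolve) => setTimeout(resolve, ms))

// Send prompts one at a time: each request completes before the next starts,
// with a fixed delay in between to avoid tripping rate limiting.
async function sendSequentially<T>(
  prompts: string[],
  send: (prompt: string) => Promise<T>,
  delayMs = 1000
): Promise<T[]> {
  const results: T[] = []
  for (const prompt of prompts) {
    results.push(await send(prompt))
    await sleep(delayMs)
  }
  return results
}
```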
I'm also using `onConversationResponse`. I have not had a problem when rapid-firing requests without these parameters, but I will give your suggestion a shot.