llm-workflow-engine
Is it theoretically possible to pass ChatGPT credentials to the wrapper without relying on Playwright? Is this a planned future feature?
From what I gather, there probably isn't a truly accessible API for ChatGPT (unless you partner with them like Microsoft has), but I wonder if it's just a matter of time before you figure out how to pass a token or credentials (forgive me, my knowledge of the subject is limited) to the wrapper instead of having to rely on GUI-automation tools like Playwright.
I apologize for posting a question in the issues section if this is not the appropriate place; I'm not sure where else to ask. I'm working on a personal side project using this wrapper and have become eager to share it as a service. The only thing keeping me from doing so is the current limitation of how you have to log in to ChatGPT.
Somebody please correct me if I'm wrong, but there have been other ChatGPT API projects that were calling the API directly, and these broke after some recent changes by OpenAI to the unofficial ChatGPT API endpoints. Several of those projects simply gave up and archived their code.
One of the reasons that this project still works is that it's using a real browser in the background, and making these API calls (from OpenAI's point of view) from the official ChatGPT website.
I did see another project that supported sending API requests to a proxy server, which I'm guessing ran a browser on its end, and that's probably faster. But it's also probably a lot harder to set up and maintain.
IMO, until/unless there's an official ChatGPT API, the approach that this project uses is the simplest and most stable available, despite being slower.
If an official API is published/supported by OpenAI, then using Playwright would be obviated; we'd need to rip it out and make some adjustments (most importantly, converting to whatever their supported auth scheme would be) to keep the project viable.
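As a rough illustration (not the wrapper's actual code), the browser-backed approach amounts to something like the sketch below, using Playwright for Python. The user-data directory and URL here are just assumptions; the real wrapper also handles session/auth details and streaming responses.

```python
# Minimal sketch of a browser-backed approach, assuming Playwright for Python.
# The user-data directory and URL are illustrative; this is NOT the wrapper's actual code.
import os

from playwright.sync_api import sync_playwright


def open_chatgpt_session(user_data_dir: str = "~/.chatgpt-session"):
    with sync_playwright() as p:
        # A persistent context reuses cookies from a prior manual login,
        # so requests appear (to OpenAI) to come from the official ChatGPT website.
        context = p.chromium.launch_persistent_context(
            os.path.expanduser(user_data_dir), headless=True
        )
        page = context.new_page()
        page.goto("https://chat.openai.com/")
        # ...the wrapper then drives the page (or its backend calls) from here...
        context.close()


if __name__ == "__main__":
    open_chatgpt_session()
```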
Please excuse me if what I'm mentioning doesn't help here, but I know of a project that calls ChatGPT through a given API key, and since I used it just today, I don't believe it has broken so far. Here's the link, maybe it helps: https://github.com/yonashailug/chatgpt-cli
Hey @fcnjd, thank you for the link. This was in fact the first project I ran into originally, but I realized it is actually not for ChatGPT; it's for OpenAI's GPT-3 service, where you pay per token. It works well enough, but I don't think it's exactly the same as ChatGPT.
My understanding is that ChatGPT runs GPT 3.5 (whatever that means), and the published OpenAI endpoints use GPT 3. I'm not 100% sure if that means ChatGPT is more capable, but in my brief tests, it sure looks to be.
That is correct, the public OpenAI model callable from the API (the latest one) is text-davinci-003, which is DIFFERENT from ChatGPT. (If I'm not mistaken, it is missing the RLHF part and the initial prompt that ChatGPT is given.) I've read that they intend to publish ChatGPT as an API as well in the future.
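For comparison, calling the public endpoint with that model looks roughly like this with the openai Python package (pre-1.0 interface, as of early 2023); the prompt and parameters are just placeholders:

```python
# Rough sketch of the public GPT-3 completion API (openai Python package, pre-1.0 interface).
# Note: this hits text-davinci-003, which is not the same model ChatGPT uses.
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.Completion.create(
    model="text-davinci-003",
    prompt="Explain the difference between GPT-3 and ChatGPT in one sentence.",
    max_tokens=100,
)
print(response["choices"][0]["text"].strip())
```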
Hello, thanks for pointing me in the right direction here. I hadn't paid attention to the difference between GPT-3 and 3.5 previously; now I've learned a bit more about it. Anyway, it looks like the API has now been released: https://techcrunch.com/2023/03/01/openai-launches-an-api-for-chatgpt-plus-dedicated-capacity-for-enterprise-customers/
Nice find! I think that would be a valuable feature to add to this wrapper!
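If/when support lands, a call to the newly released ChatGPT endpoint would look roughly like this with the openai Python package (pre-1.0 interface); the model name and message below are only illustrative:

```python
# Rough sketch of the newly released ChatGPT API (openai Python package, pre-1.0 interface).
import os

import openai

openai.api_key = os.environ["OPENAI_API_KEY"]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Say hello from the ChatGPT API."},
    ],
)
print(response["choices"][0]["message"]["content"])
```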
Browser backend is deprecated; no new features specific to that backend will be implemented.