chatgpt-prompt-splitter
ChatGPT PROMPTs Splitter. A tool for safely processing prompts in chunks of up to 15,000 characters per request.
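The splitting idea behind the tool can be sketched as below. This is a hypothetical re-implementation, not the tool's actual code: the `split_prompt` name and the naive fixed-width cut are assumptions, and the real tool additionally wraps each chunk in instructions telling ChatGPT to wait for the remaining parts before answering.

```python
def split_prompt(text: str, max_len: int = 15000) -> list[str]:
    """Split a long prompt into chunks of at most max_len characters.

    Hypothetical sketch of the splitting step; the real tool also
    prepends "wait for the next part" instructions to each chunk.
    """
    chunks = []
    while text:
        chunks.append(text[:max_len])
        text = text[max_len:]
    return chunks


parts = split_prompt("x" * 32000)
# 32,000 characters split at 15,000 gives chunks of 15000, 15000, 2000
print([len(p) for p in parts])
```

A character cut like this can land mid-word or mid-sentence; splitting on the nearest whitespace or paragraph boundary below `max_len` would be a friendlier variant.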
Clicking the button and following the standard Vercel deployment prompts results in:
```
This Serverless Function has crashed.
Your connection is working correctly.
Vercel is working correctly.
500: INTERNAL_SERVER_ERROR
Code: FUNCTION_INVOCATION_FAILED...
```
I don't understand how you guys run this. I get all sorts of errors, from `.env` not found (it isn't included) to the `index.html` template not found (wrong folder?).
Thanks for your great idea. I tried calling the OpenAI API and sending the chunks into the conversation context. Due to the token limit, I got this error: `"type": "InvalidRequestError", "message":...`
Are there any plans to find a way to use this with the [OpenAI API](https://platform.openai.com/docs/api-reference/chat), rather than the ChatGPT frontend?
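One reason the API route hits the `InvalidRequestError` above is that the API limits *tokens* across the whole conversation, not characters per message, so 15,000-character chunks can still overflow the model's context window once several of them accumulate. A minimal sketch of budgeting chunks against a token limit follows; the function names are hypothetical, and the 4-characters-per-token estimate is a crude heuristic (a real implementation would count tokens with a proper tokenizer such as tiktoken).

```python
def rough_token_count(text: str) -> int:
    """Crude token estimate: roughly 4 characters per token for English.

    An assumption for illustration only; use a real tokenizer in practice.
    """
    return max(1, len(text) // 4) if text else 0


def fits_context(chunks: list[str], context_tokens: int, reply_reserve: int = 1000) -> bool:
    """Check whether all chunks plus a reserved reply budget fit the window.

    The API rejects the request once the messages and the expected reply
    together exceed the model's context window.
    """
    used = sum(rough_token_count(c) for c in chunks)
    return used + reply_reserve <= context_tokens


chunks = ["x" * 15000, "x" * 15000]          # two full-size chunks
print(fits_context(chunks, context_tokens=4096))   # ~7500 tokens: too big
print(fits_context(chunks, context_tokens=16384))  # fits with room for a reply
```

In other words, an API-backed version of this tool would need to size chunks by the target model's context window rather than by a fixed character count.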