Augmenting the answer with each iteration
How about making the output of the ChatGPT response "dynamic", so that the ask function can be placed in a for loop and yields the augmented ChatGPT response on each iteration? I think this could be done with yield, sending GET requests that indicate the message_id.
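A minimal sketch of what this proposal could look like. All names here are illustrative, not the library's real interface: `ask` is reimagined as a generator, and `fetch_chunks` is a hypothetical stand-in for the repeated GET requests keyed by message_id.

```python
def fetch_chunks(prompt):
    # Hypothetical stand-in for repeated GET requests indicating the
    # message_id; in the real library each chunk would come from the
    # ChatGPT backend.
    for word in ("Hello", " from", " a", " streamed", " reply"):
        yield word

def ask(prompt):
    """Yield the response incrementally so callers can use a for loop."""
    partial = ""
    for chunk in fetch_chunks(prompt):
        partial += chunk
        yield partial  # each iteration returns the augmented response so far

# Usage: print the growing response on each iteration.
for message in ask("Hi"):
    print(message)
```

The caller sees the answer grow with each iteration instead of waiting for the full response.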
Could you give more details? I don't really understand what you're proposing.
As for streaming responses, I will need to modify the tls_client to support it
That's what I mean https://user-images.githubusercontent.com/61879874/209655208-ec73eb70-afd7-4b15-9974-f29de8d33233.mp4
Ah. That was possible with the old library. I am still working to bring streaming support to tls_client. Need to port a lot of the requests library
Thanks for your work, I'll be waiting
Spawning browser...
Traceback (most recent call last):
  File "/data/chatgpt/bin/revChatGPT/ChatGPT.py", line 404, in get_cf_cookies
    browser_executable_path=self.config.get("browser_exec_path")
  File "/usr/local/lib/python3.6/site-packages/undetected_chromedriver/__init__.py", line 51, in __new__
    instance.__init__(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'enable_cdp_events'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "chatgpt_app.py", line 10, in
@Liu-Angelo Open a new issue.
The streaming mode in the old library is very useful! I look forward to seeing it in this library too~
Might not be possible. Too much work to rewrite the requests library in a different language
This would be really interesting. Did you consider using FastAPI instead of Flask? It has built-in HTTP streaming support.
It won't work either way because the base library (tls_client) doesn't support streaming