ChatGPT-Linebot-using-python-flask-on-vercel
TODO: support longer response
Currently, we use max_token = 240, so some responses are cut off at that limit. However, the user can say "繼續" ("continue") to get the rest of the response.
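As a rough illustration of this behaviour (a minimal sketch only, not the repository's actual code; it assumes the pre-1.0 `openai` Python SDK and hypothetical helper names), a truncated reply can be detected from `finish_reason`, and a "繼續" message simply gets appended to the conversation so the model keeps going:

```python
import openai

MAX_TOKENS = 240  # same cap mentioned above

def chat_once(history):
    """Send the running conversation and return (text, was_truncated)."""
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=history,
        max_tokens=MAX_TOKENS,
    )
    choice = resp.choices[0]
    # finish_reason == "length" means the reply was cut at max_tokens.
    return choice.message.content, choice.finish_reason == "length"

def handle_user_text(history, user_text):
    """Append the user message; '繼續' just asks the model to continue."""
    history.append({"role": "user", "content": user_text})
    text, truncated = chat_once(history)
    history.append({"role": "assistant", "content": text})
    if truncated:
        text += "\n(輸入「繼續」以取得後續內容)"  # hint that more is available
    return text
```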
Try to enhance this; there are several possible approaches.
- Reminder 1: Take care of unbounded responses. For example, a user may ask the chatbot to respond with over 1,000,000 words, which would take a REALLY LONG time to generate.
- Reminder 2: Vercel (free plan) will time out if there is no response within 10 seconds. Take care of this (see the sketch after this list).
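One possible direction, shown only as a sketch (it assumes line-bot-sdk v2 and hypothetical helper names, not the repository's implementation): keep max_tokens bounded so a single request stays under Vercel's 10-second limit, then split the finished answer across several LINE messages in one reply instead of sending a single over-long message. LINE accepts up to 5 message objects per reply token and up to 5000 characters per text message.

```python
from linebot import LineBotApi
from linebot.models import TextSendMessage

LINE_TEXT_LIMIT = 5000   # max characters per LINE text message
LINE_REPLY_LIMIT = 5     # max message objects per reply token

def split_for_line(text):
    """Split a long answer into chunks LINE will accept in a single reply."""
    chunks = [text[i:i + LINE_TEXT_LIMIT]
              for i in range(0, len(text), LINE_TEXT_LIMIT)]
    return chunks[:LINE_REPLY_LIMIT]  # drop anything beyond the 5-message cap

def reply_long_text(line_bot_api: LineBotApi, reply_token: str, text: str):
    """Reply with several messages instead of one cut-off message."""
    messages = [TextSendMessage(text=chunk) for chunk in split_for_line(text)]
    line_bot_api.reply_message(reply_token, messages)
```

Splitting only changes how the text is delivered; the generation time is still controlled by the max_tokens cap, which is what keeps the request within the 10-second timeout.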