
Implement chatbot functionality using Streamlit

JamePeng opened this issue 1 year ago • 4 comments

This commit adds the implementation of a chatbot using Streamlit, a Python library for building interactive web applications. The chatbot allows users to interact with an AI assistant, asking questions and receiving responses in real-time.

Features include:

  • Integration with the MiniCPM-V-2.0 model for generating responses.
  • User-friendly interface with text input for questions and options for uploading images.
  • Sidebar settings for adjusting parameters such as max_length, top_p, and temperature.
  • Ability to clear chat history to start a new conversation.

The chat history and session state are managed using Streamlit's session_state functionality, ensuring a seamless user experience across interactions.

This implementation provides a simple and intuitive way for users to engage with the chatbot, making it accessible for various use cases.
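The core pattern behind this description is Streamlit's session-state chat history: the script reruns on every interaction, so history must persist across reruns. Below is a minimal, hedged sketch of that logic; `st.session_state` is dict-like, so a plain dict stands in here (the function names `init_history`, `append_turn`, and `clear_history` are illustrative, not from the PR):

```python
# Sketch of the chat-history pattern the demo relies on.
# st.session_state is a dict-like object; a plain dict stands in
# so the logic can run without a Streamlit server.
session_state = {}

def init_history(state):
    # Streamlit reruns the whole script on each interaction, so the
    # history list must be created only if it is not already present.
    state.setdefault("chat_history", [])

def append_turn(state, question, answer):
    # Store both sides of the exchange for redisplay on the next rerun.
    state["chat_history"].append({"role": "user", "content": question})
    state["chat_history"].append({"role": "assistant", "content": answer})

def clear_history(state):
    # Backs the "clear chat history" button: start a new conversation.
    state["chat_history"] = []

init_history(session_state)
append_turn(session_state, "What is in this image?", "A cat on a sofa.")
clear_history(session_state)
```

In the real app, `clear_history` would be wired to a sidebar button and the loop over `chat_history` would render each message with Streamlit's chat widgets.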

JamePeng avatar Apr 17 '24 16:04 JamePeng

Update MiniCPM-Llama3-V-2_5 streamlit demo

JamePeng avatar May 20 '24 18:05 JamePeng

Thanks for your contribution, we will review this PR soon.

iceflame89 avatar May 21 '24 13:05 iceflame89

Thanks for your contribution. There are several questions:

  • It seems that multi-turn conversation is not implemented; I noticed that the answers are not appended to the history msgs. Is that right?
  • Sampling decoding also supports top_k and repetition_penalty; you could add these parameters to the sidebar. ref
  • v2.5 now supports streaming output; could this be added? Just pass stream=True to model.chat. ref
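The last two suggestions can be sketched together: expose the extra sampling parameters and consume a streaming generator chunk by chunk. Streamlit and the model are stubbed out below so the control flow runs standalone; in the actual demo the values would come from `st.sidebar.slider` and the generator from `model.chat(..., stream=True)` (the default values shown are illustrative, not from the PR):

```python
# Generation parameters as they might be collected from the sidebar.
params = {
    "max_length": 2048,
    "top_p": 0.8,
    "temperature": 0.7,
    # The two parameters the reviewer asked to expose:
    "top_k": 100,
    "repetition_penalty": 1.05,
}

def fake_stream_chat(question, **gen_kwargs):
    # Stand-in for model.chat(..., stream=True), which yields text chunks.
    for chunk in ["Hello", ", ", "world", "!"]:
        yield chunk

# Streaming consumption: accumulate chunks and redraw incrementally,
# which is what updating an st.empty() placeholder would do in the app.
answer = ""
for piece in fake_stream_chat("Hi", **params):
    answer += piece
```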

YuzaChongyi avatar May 27 '24 02:05 YuzaChongyi

Done! This update, based on the May 25, 2024 version of modeling_minicpmv.py, includes the following enhancements:

  1. Introduction of repetition_penalty and top_k parameters to the st.sidebar, enabling users to adjust these model parameters dynamically.
  2. Default support for stream=True in the model.chat method to facilitate real-time streaming responses.

Additionally, I don't think multi-turn conversation is essential here. The chat_history array is currently used only to display the conversation on the Streamlit interface; the code maintains continuous interaction based on the last uploaded image. Feeding the historical text from earlier turns back into the model could interfere with its ability to describe each image accurately when different images are shown across turns.
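For contrast, the reviewer's first point would look roughly like the sketch below: each assistant answer is appended back into the msgs list passed to the model, so later turns see earlier context. This is not what the merged demo does (it keeps only the display history), and the helper `build_msgs` is a hypothetical name for illustration:

```python
# Hypothetical multi-turn msgs construction, per the reviewer's suggestion.
def build_msgs(history, new_question):
    """history: list of (question, answer) pairs from previous turns."""
    msgs = []
    for q, a in history:
        msgs.append({"role": "user", "content": q})
        # Appending the assistant reply is the step the reviewer noted
        # was missing from the original PR.
        msgs.append({"role": "assistant", "content": a})
    msgs.append({"role": "user", "content": new_question})
    return msgs

history = [("What animal is this?", "It looks like a cat.")]
msgs = build_msgs(history, "What color is it?")
```

The trade-off the author describes is real: carrying text from earlier turns about a different image can bias the model's answer about the current one, which is why the demo resets context per image.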

JamePeng avatar May 27 '24 16:05 JamePeng