DashBot Udhay Adithya PoC
PR Description
Demo Video: dashbot_poc.mov
Checklist
- [x] I have gone through the contributing guide
- [x] I have updated my branch and synced it with project `main` branch before making this PR
- [x] I am using the latest Flutter stable branch (run `flutter upgrade` and verify)
- [x] I have run the tests (`flutter test`) and all tests are passing
OS on which you have developed and tested the feature?
- [ ] Windows
- [x] macOS
- [ ] Linux
Make sure to add a video demo in this PR when the PoC is ready for review.
Hey @animator, the PoC is ready for review.
For now, DashBot supports only local LLMs through Ollama and can explain API responses. I've also added an option to chat with DashBot, which will remain during the development stage to allow free interaction and help refine the system prompt. A few issues that need to be addressed in future commits:
- preventing multiple instances of DashBot from being created by pressing the floating action button
- handling errors from the Ollama APIs
- refining the system prompt
- adding memory capability to maintain context in conversations
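For reference, a rough sketch of what the "explain this response" call to a local Ollama model could look like, assuming the `http` package and Ollama's default local endpoint. The model name, prompt wording, and function name are placeholders, not the actual DashBot implementation:

```dart
import 'dart:convert';

import 'package:http/http.dart' as http;

/// Ask a locally running Ollama model to explain an API response.
/// Assumes Ollama is serving on its default port (11434) and the chosen
/// model has already been pulled locally.
Future<String> explainApiResponse(String responseBody) async {
  final res = await http.post(
    Uri.parse('http://localhost:11434/api/chat'),
    headers: {'Content-Type': 'application/json'},
    body: jsonEncode({
      'model': 'llama3', // placeholder: any locally available model
      'stream': false,
      'messages': [
        {
          'role': 'system',
          'content': 'You are DashBot. Explain API responses concisely.',
        },
        {
          'role': 'user',
          'content': 'Explain this API response:\n$responseBody',
        },
      ],
    }),
  );
  if (res.statusCode != 200) {
    // Surfacing Ollama errors is one of the follow-ups listed above.
    throw Exception('Ollama error ${res.statusCode}: ${res.body}');
  }
  final decoded = jsonDecode(res.body) as Map<String, dynamic>;
  return (decoded['message'] as Map<String, dynamic>)['content'] as String;
}
```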
Hey @Udhay-Adithya, The initial implementation looks good.
Here are some improvements I had in mind, some of which you might have already considered:
- It would be nice if the chat window could be made resizable with constraints or adapted to main window resizing changes.
- Persisting LLM responses against each request (only for the session):
  - so the chat window is minimizable without clearing the response.
  - so switching between active requests doesn't clear the old request's messages.
- Some UI nitpicks:
  - the formatting of elements in the response seems to have an issue with displaying code.
  - after a response, the initial template prompt buttons can be made available again.
  - the layout of DashBot settings in the settings page can be improved.
Hi @DenserMeerkat ,
Thank you for taking the time to review my work and share your detailed feedback.
> some of which you might have already considered
You're right — I had already considered most of your suggestions and planned to implement them.
> Persisting LLM responses against each request (only for the session)
Could you please clarify what you mean by session here? Are you referring to the duration between opening and closing API Dash, the lifetime of each request tab, or do you mean persisting chat history locally for each request?
> the formatting of elements in the response seems to have an issue with displaying code.
Yes, I haven’t yet focused on formatting the response content from DashBot. My priority so far has been implementing the core features, but improving the response formatting is definitely on my to-do list.
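One possible way to handle that is to render DashBot replies as Markdown so fenced and inline code get monospace styling; a minimal sketch, assuming the `flutter_markdown` package (the widget name is a placeholder, not apidash code):

```dart
import 'package:flutter/material.dart';
import 'package:flutter_markdown/flutter_markdown.dart';

/// Render a DashBot reply as Markdown so code blocks and inline code are
/// displayed with proper formatting instead of plain text.
class DashBotMessage extends StatelessWidget {
  const DashBotMessage({super.key, required this.markdown});

  final String markdown;

  @override
  Widget build(BuildContext context) {
    return MarkdownBody(
      data: markdown,
      selectable: true, // allow copying code snippets from the reply
    );
  }
}
```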
> after a response, the initial template prompt buttons can be made available again
That’s a great suggestion — I’ll make sure to add that.
> It would be nice if the chat window could be made resizable with constraints or adapted to main window resizing changes.
Yes, UI/UX improvements for the DashBot-related pages are definitely coming, as outlined in both my proposal and Figma design.
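As a rough idea for the "adapt to main window resizing" part, the chat overlay could be constrained relative to the window size; a minimal sketch (the widget name and size fractions are assumptions, not the final design):

```dart
import 'package:flutter/material.dart';

/// Size the DashBot chat overlay relative to the main window so it
/// follows window resizes instead of using a fixed size.
class DashBotChatOverlay extends StatelessWidget {
  const DashBotChatOverlay({super.key, required this.child});

  final Widget child;

  @override
  Widget build(BuildContext context) {
    final window = MediaQuery.of(context).size;
    return ConstrainedBox(
      constraints: BoxConstraints(
        // Cap the chat window at roughly 40% x 70% of the main window.
        maxWidth: window.width * 0.4,
        maxHeight: window.height * 0.7,
      ),
      child: child,
    );
  }
}
```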
Many of the questions and suggestions you've raised are already addressed in my GSoC Proposal and Figma Design. The final DashBot will closely follow what’s promised there.
Thanks again for your feedback — I really appreciate it!
> Could you please clarify what you mean by session here? Are you referring to the duration between opening and closing API Dash, the lifetime of each request tab, or do you mean persisting chat history locally for each request?
Yes, that's what I mean by a session: between opening and closing API Dash.
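A minimal sketch of that kind of session-scoped persistence, keeping chat history in memory per request id for the lifetime of the app (class and method names are illustrative, not actual apidash APIs):

```dart
/// A single chat message exchanged with DashBot.
class ChatMessage {
  ChatMessage({required this.role, required this.text});

  final String role; // 'user' or 'assistant'
  final String text;
}

/// In-memory, session-scoped store: history survives minimizing the chat
/// window or switching request tabs, and is discarded when API Dash closes.
class DashBotSessionStore {
  final Map<String, List<ChatMessage>> _historyByRequest = {};

  /// Messages already exchanged for a given request during this session.
  List<ChatMessage> messagesFor(String requestId) =>
      List.unmodifiable(_historyByRequest[requestId] ?? <ChatMessage>[]);

  /// Record a new message against the request it belongs to.
  void append(String requestId, ChatMessage message) {
    _historyByRequest.putIfAbsent(requestId, () => <ChatMessage>[]).add(message);
  }
}
```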
Closing PoC.