Josh XT
Not a low priority - just trying to get through bug fixes currently. I just removed the version cap for `llama-cpp-python` so that the latest can be used. ``` pip...
Sorry, I haven't been keeping up on the llamacpp changes, but I have heard they're amazing! I mostly use OpenAI for all of my testing currently just for the simple...
Merging #431 to hopefully resolve this. Please try it out and let me know how it goes!
Working on this in #446 . If you have the API server running, you're welcome to try it.
> I tried to use server llama.cpp but without a success ... Any guide how to use it here?

Don't use the `llamacppapi` one, just use the `llamacpp` one. The...
Already opted out of the telemetry with Chroma; I'll have to look into the Streamlit one unless someone wants to PR it. From `Memories.py`:

```python
def initialize_chroma_client(self):
    try:
        return chromadb.Client(
            settings=chromadb.config.Settings(...
```
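For reference, a minimal sketch of the Chroma opt-out (not the exact AGiXT code; it assumes `chromadb` is installed and uses its `Settings(anonymized_telemetry=False)` flag, which disables Chroma's usage reporting):

```python
# Sketch, assuming chromadb is available: anonymized_telemetry=False
# on chromadb.config.Settings opts the client out of usage telemetry.
try:
    import chromadb
    from chromadb.config import Settings

    def initialize_chroma_client():
        # Telemetry is opted out via anonymized_telemetry=False.
        return chromadb.Client(Settings(anonymized_telemetry=False))
except ImportError:  # chromadb not installed in this environment
    chromadb = None
```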
Streamlit problems will be moved to the Streamlit repo soon. I will look into ways to disable the Streamlit telemetry.
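One way to disable Streamlit's usage telemetry (a sketch, assuming the standard setup where Streamlit reads `~/.streamlit/config.toml` and the `[browser]` section's `gatherUsageStats` option controls stats collection):

```python
from pathlib import Path

# Sketch: write a Streamlit config that turns off usage-stat gathering.
# Note: this overwrites any existing ~/.streamlit/config.toml.
config_dir = Path.home() / ".streamlit"
config_dir.mkdir(parents=True, exist_ok=True)
(config_dir / "config.toml").write_text("[browser]\ngatherUsageStats = false\n")
```

The same effect can also be had per-run by setting the option on the command line instead of writing a file.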
So is this good to merge, @localagi?
What a champion lol. PR open within 30 minutes of model release.