There is a Bug in the MistralAi Library.
File "/home/x/devika/src/llm/llm.py", line 11, in <module>
from .mistral_client import MistralAi
File "/home/x/devika/src/llm/mistral_client.py", line 2, in <module>
from mistralai.models.chat_completion import ChatMessage
ModuleNotFoundError: No module named 'mistralai.models.chat_completion'
The module has been installed numerous times.
I can't debug the entire Mistral AI module myself, so it is disabled in llm.py:
from .gemini_client import Gemini
# from .mistral_client import MistralAi
from .groq_client import Groq
With that change, the project runs both standalone and in docker-compose.
However, some of the ports in devika.py are misconfigured for the front end, so it doesn't attach to the back end whether running manually or in Docker.
There is also no sandbox attachment in standalone mode or in Docker.
Did we build a rocket ship to the moon and leave everyone behind?
Perhaps an alternate dimension?
The world may never know.
Next time, if you're going to bail on a project, hit that archive button so we get the alert.
Well, there is no actual chat_completion module, so what can be done to make it right? I tried reinstalling, but I get the same error.
Install an older version of the library: mistralai==0.4.2
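For reference, pinning that version keeps the legacy import path from the traceback alive. A minimal sketch, assuming the 0.x SDK's MistralClient and its chat() method; the environment variable and model name below are placeholders, not devika's actual config:

# pip install mistralai==0.4.2
import os

from mistralai.client import MistralClient
from mistralai.models.chat_completion import ChatMessage

# 0.x-style usage sketch: messages are ChatMessage objects passed to chat().
client = MistralClient(api_key=os.environ["MISTRAL_API_KEY"])
response = client.chat(
    model="mistral-medium",
    messages=[ChatMessage(role="user", content="Hello from the 0.x client")],
)
print(response.choices[0].message.content)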
It's working fine now! I just changed the code to the newer version instead. Thanks though!
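For anyone else taking the update route instead of pinning, here is a rough sketch of the same call against the 1.x SDK, assuming its top-level Mistral client and chat.complete() method; the environment variable and model name are placeholders:

import os

from mistralai import Mistral

# 1.x-style usage sketch: ChatMessage is gone, and messages are plain dicts
# passed to chat.complete(). Model name is a placeholder, not devika's config.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])
response = client.chat.complete(
    model="mistral-medium-latest",
    messages=[{"role": "user", "content": "Hello from the 1.x client"}],
)
print(response.choices[0].message.content)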
I've encountered the same issue in the Mistral client file and worked on a solution (updated the code to the newer version). I submitted a pull request addressing this, and it was verified five days ago. However, it hasn't been merged yet.
Same here
Resolved: https://github.com/stitionai/devika/pull/633
I'm hitting the bug now, and my version is the latest, 1.2.5. Does anyone have any ideas?