`rai_whoami/build_whoami.py` frequently throws various exceptions when using small models (e.g. llama3.1:8b)
Describe the bug
When a small model (e.g. llama3.1:8b) is used for both simple and complex tasks, the command `python src/rai_whoami/rai_whoami/build_whoami.py panda/ --build-vector-db` frequently (but not always) throws various exceptions. For example:
```
Traceback (most recent call last):
  File "/home/
```
To Reproduce
Steps to reproduce the behavior:
- Set up RAI according to the quick setup guide
- During the project configuration step (i.e., when running the `poetry run streamlit run src/rai_core/rai/frontend/configurator.py` configuration tool), select a small model (e.g. llama3.1:8b) for both simple and complex tasks.
- Set up the robot's identity according to this example
- Run the `python src/rai_whoami/rai_whoami/build_whoami.py panda/ --build-vector-db` command (both commands are collected below for convenience).
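For reference, a consolidated shell sketch of steps 2 and 4, assuming the commands are run from the RAI repository root and that `panda/` is the identity directory created in step 3:

```bash
# Step 2: launch the configurator and select llama3.1:8b for both simple and complex tasks
poetry run streamlit run src/rai_core/rai/frontend/configurator.py

# Step 4: build the whoami information and the vector DB for the panda/ identity
python src/rai_whoami/rai_whoami/build_whoami.py panda/ --build-vector-db
```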
Expected behavior
The `src/rai_whoami/rai_whoami/build_whoami.py` script should not throw an unhandled exception. Instead, the user should be informed that building the whoami has failed and be given a recommendation to, for example, use a more sophisticated model.
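A minimal sketch of what such handling could look like, assuming a hypothetical `build_vector_db()` entry point (the actual structure of `build_whoami.py` may differ):

```python
import sys


def build_vector_db() -> None:
    """Placeholder for the real vector-DB build step in build_whoami.py."""
    # Simulate the kind of failure seen with small models.
    raise RuntimeError("LLM returned malformed output")


def main() -> None:
    try:
        build_vector_db()
    except Exception as exc:
        # Report the failure in plain language instead of letting the traceback escape.
        print(
            f"Building the whoami failed: {exc}\n"
            "This tends to happen with small models (e.g. llama3.1:8b); "
            "consider retrying with a more sophisticated model.",
            file=sys.stderr,
        )
        sys.exit(1)


if __name__ == "__main__":
    main()
```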
Screenshots None
Platform
- OS: Ubuntu 22.04
- ROS 2 Version: Humble
- Other information: llama3.1:8b Q4_K_M was used as the model for simple tasks, complex tasks, and embeddings
Version commit 0e12297a468eab719b25228f5e273877605cbdfd
Additional context None