Prince Canuma

572 comments by Prince Canuma

There is a full traceback for that error that you are not printing in your tests. I need it to understand where the error you are getting is located. I...

> mlx version: 0.22.0.dev20250110+1ce0c0fcb

Please note that you are still using an unofficial release of mlx. I would recommend you uninstall it, install the official release, and try again. ```...
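A minimal sketch of the suggested reinstall, assuming the environment is managed with pip and that `mlx` is the standard PyPI package name:

```shell
# Remove the unofficial dev build, then install the latest official release.
pip uninstall -y mlx
pip install -U mlx
```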

Let me know if the error persists after you install the official release.

You will need to install mlx-vlm from source and change the test_smoke.py file. (I will make the necessary change to display full traceback on the next release) Instead, you could:...
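Until the next release surfaces full tracebacks, a test file can do it itself. The helper below is a hypothetical sketch (not the actual `test_smoke.py` code) showing one way to capture the complete stack trace instead of only the final error message:

```python
import traceback


def format_failure(fn):
    """Run a callable and return the full traceback string on failure.

    Hypothetical helper illustrating how a smoke-test harness could
    print the whole stack trace rather than swallowing it.
    Returns None when the callable succeeds.
    """
    try:
        fn()
        return None
    except Exception:
        return traceback.format_exc()


def boom():
    # Stand-in for a failing model-loading step.
    raise ValueError("model load failed")


tb = format_failure(boom)
if tb is not None:
    print(tb)  # full traceback, including the raising frame
```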

I see, you are probably running the latest transformers. Let me try something to see if it fixes it.

I just fixed that. Download a fresh copy of the model weights :)

https://huggingface.co/mlx-community/llava-v1.6-mistral-7b-8bit/commit/b8df5f329d95a7abe6429ed46093f9b84e8e6396
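One way to pick up the patched weights, assuming `huggingface_hub` is installed, is to force a re-download of the repository snapshot so the stale cached copy is bypassed:

```python
from huggingface_hub import snapshot_download

# force_download=True ignores the local cache, so the freshly
# fixed weights are fetched from the Hub.
path = snapshot_download(
    "mlx-community/llava-v1.6-mistral-7b-8bit",
    force_download=True,
)
print(path)  # local directory holding the fresh snapshot
```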

My pleasure! I think so. I will update all models.

Hey @Huy2002-IT Thanks for opening the issue! Could you elaborate on the problem?