mlx-vlm
Chat UI from readme fails
I installed the project from source in a new environment and tried to launch the chat UI following the instructions in the README, but it fails with a type error that seems to come from internal code changes. I'm using Python 3.11.11.
❯ python -c "import mlx_vlm; print(mlx_vlm.__version__)"
0.1.11
❯ python -m mlx_vlm.chat_ui --model mlx-community/Qwen2-VL-2B-Instruct-4bit
Fetching 11 files: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 11/11 [00:00<00:00, 27995.96it/s]
Fetching 11 files: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 11/11 [00:00<00:00, 20717.26it/s]
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "/Users/andresmarafioti/Documents/SmolVLM250M/.venv/lib/python3.11/site-packages/mlx_vlm/chat_ui.py", line 26, in <module>
model, processor = load(args.model, {"trust_remote_code": True})
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/andresmarafioti/Documents/SmolVLM250M/.venv/lib/python3.11/site-packages/mlx_vlm/utils.py", line 275, in load
model = apply_lora_layers(model, adapter_path)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/andresmarafioti/Documents/SmolVLM250M/.venv/lib/python3.11/site-packages/mlx_vlm/trainer/utils.py", line 139, in apply_lora_layers
adapter_path = Path(adapter_path)
^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.11/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/pathlib.py", line 871, in __new__
self = cls._from_parts(args)
^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.11/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/pathlib.py", line 509, in _from_parts
drv, root, parts = self._parse_args(args)
^^^^^^^^^^^^^^^^^^^^^^
File "/opt/homebrew/Cellar/python@3.11/3.11.11/Frameworks/Python.framework/Versions/3.11/lib/python3.11/pathlib.py", line 493, in _parse_args
a = os.fspath(a)
^^^^^^^^^^^^
TypeError: expected str, bytes or os.PathLike object, not dict
The failing call is `model, processor = load(args.model, {"trust_remote_code": True})`. If I remove `{"trust_remote_code": True}` from chat_ui.py, this error goes away, but I then hit other errors further on.
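For reference, here is a minimal sketch of what the traceback suggests is happening: the second positional parameter of `load` is now treated as an adapter path, so the kwargs dict lands there and `pathlib.Path()` rejects it. The signature below is an assumption based on the traceback, not the actual mlx_vlm code.

```python
from pathlib import Path

def load(model_path, adapter_path=None, **kwargs):
    """Stand-in for mlx_vlm.utils.load: any second positional argument
    is treated as an adapter path and handed to pathlib.Path()."""
    if adapter_path is not None:
        adapter_path = Path(adapter_path)  # TypeError when given a dict
    return model_path, adapter_path

# What chat_ui.py does: the options dict is passed positionally,
# so it ends up in adapter_path and Path() raises TypeError.
try:
    load("mlx-community/Qwen2-VL-2B-Instruct-4bit", {"trust_remote_code": True})
except TypeError as e:
    print("reproduced:", e)

# Passing the option as a keyword argument instead avoids the crash
# (assuming load forwards extra kwargs; that part is a guess):
load("mlx-community/Qwen2-VL-2B-Instruct-4bit", trust_remote_code=True)
```

This matches the symptom: `os.fspath()` is called on the dict inside `Path.__new__` and fails with `expected str, bytes or os.PathLike object, not dict`.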
I see the same or a similar error with Python 3.10, 3.11, and 3.12, with mlx-vlm 0.1.14 or a build from main, and with several (possibly all) models.
This was fixed :)