Regarding UI support for the MLX engine
Xinference already supports inference of MLX-format models as of the previous version (0.13.1).
After I registered a downloaded MLX model and tried to launch it from the custom-model page in the UI, I could not find the MLX engine in the engine list. I hope the next version will support it in the UI.
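In the meantime, a possible workaround is to launch the MLX-format model programmatically rather than through the UI. Below is a minimal sketch using the Xinference Python client; the endpoint, model name, size, and quantization are illustrative placeholders, not values from this thread, and the exact `launch_model` arguments may differ slightly between releases.

```python
# Minimal sketch: launching an MLX-format model via the Xinference Python client
# instead of the web UI. Endpoint, model name, size, and quantization below are
# illustrative placeholders -- adjust them to the model you actually registered.
from xinference.client import Client

client = Client("http://localhost:9997")  # assumed local Xinference endpoint

model_uid = client.launch_model(
    model_name="qwen2-instruct",   # hypothetical model with an MLX variant
    model_engine="MLX",            # select the MLX engine explicitly
    model_format="mlx",
    model_size_in_billions=7,
    quantization="4-bit",
)

# The returned handle can then be used for chat/generate calls.
model = client.get_model(model_uid)
print("Launched MLX model with uid:", model_uid)
```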
Version: 0.13.2
Registering a custom MLX model is not supported yet; we will support it as soon as possible.
This issue is stale because it has been open for 7 days with no activity.
This feature is now supported, so this issue can be closed.