mlc-llm
[Feature Request] Change OpenAI protocol default value to NOT_GIVEN
🚀 Feature
As of now, the OpenAI API protocol sets concrete default values directly. From the endpoint's point of view, it is better to set most values to NOT_GIVEN, as the spec allows; the backend can then supply model-specific configurations whenever a value has not been passed in from the frontend.
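A minimal sketch of the NOT_GIVEN sentinel pattern being proposed (the names here are illustrative, not the actual mlc-llm definitions). A sentinel lets the backend distinguish "field not passed" from an explicit value like `None` or `0`, so model-specific defaults can be filled in server-side:

```python
# Sketch only: illustrative names, not the real mlc-llm protocol classes.
from dataclasses import dataclass
from typing import Any


class _NotGiven:
    """Sentinel distinguishing 'not passed' from an explicit None/0/False."""

    def __repr__(self) -> str:
        return "NOT_GIVEN"


NOT_GIVEN = _NotGiven()


@dataclass
class ChatCompletionRequest:
    # Fields default to NOT_GIVEN instead of concrete values.
    temperature: Any = NOT_GIVEN
    top_p: Any = NOT_GIVEN


def resolve(request: ChatCompletionRequest, model_defaults: dict) -> dict:
    """Backend-side merge: use the model's default for any field not given."""
    resolved = {}
    for name, default in model_defaults.items():
        value = getattr(request, name)
        resolved[name] = default if value is NOT_GIVEN else value
    return resolved
```

For example, `resolve(ChatCompletionRequest(temperature=0.2), {"temperature": 0.7, "top_p": 0.95})` keeps the explicit `temperature=0.2` but fills `top_p` from the model defaults.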
cc @MasterJH5574 @anibohara2000 @Celve @Hzfengsy
https://github.com/mlc-ai/mlc-llm/pull/2178
Let us also confirm whether this is the case for JSONFFIEngine.
@tqchen PR to change this in JSONFFIEngine: https://github.com/mlc-ai/mlc-llm/pull/2225
Just to follow up on the case of JSONFFIEngine. The main purpose of JSONFFIEngine is to avoid passing in objects and parsing mlc-chat-config on the FFI side, so the current implementation, which relies on parsing and passing the default config from the Python side, is not desirable.
Instead, we should populate these values during JSONFFIEngine.reload inside the C++ implementation. This keeps the JSONFFIEngine default behavior useful across backends (such as iOS). Can you follow up on that?
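A rough sketch of that direction (written in Python for brevity; the real change lives in the C++ JSONFFIEngine, and the class and field names below are made up): defaults are read from the model's config once at reload time, then merged into each request that omits them, so every frontend sees the same behavior without shipping the config through the FFI boundary per request.

```python
# Sketch only: the actual implementation is in C++, and the names here
# (JSONFFIEngineSketch, default_generation_config) are hypothetical.
import json


class JSONFFIEngineSketch:
    def __init__(self) -> None:
        self.default_generation_config: dict = {}

    def reload(self, chat_config_json: str) -> None:
        # Pick up model-specific generation defaults once, at reload time,
        # instead of having the Python frontend parse and forward them.
        cfg = json.loads(chat_config_json)
        self.default_generation_config = {
            "temperature": cfg.get("temperature", 1.0),
            "top_p": cfg.get("top_p", 1.0),
        }

    def chat_completion(self, request: dict) -> dict:
        # Fields the frontend did not pass fall back to reload-time defaults.
        return {**self.default_generation_config, **request}
```

With this shape, an iOS or web frontend that sends only `{"top_p": 0.5}` still gets the model's reload-time `temperature` default applied on the engine side.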
@MasterJH5574 it would be good to confirm the current state of this issue in JSONFFIEngine.