mlc-llm

[Feature Request] Change OpenAI protocol default value to NOT_GIVEN

Open tqchen opened this issue 10 months ago • 6 comments

🚀 Feature

As of now, the OpenAI API protocol sets concrete default values directly. From the endpoint's point of view, it is better to set most values to NOT_GIVEN, as specified; the backend can then supply model-specific configurations when a value has not been passed in from the frontend.
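
For illustration, a minimal sketch of the idea (the `NOT_GIVEN` sentinel follows the OpenAI Python client convention; the request fields and the `resolve` helper below are hypothetical and not the actual mlc-llm code):

```python
from typing import Union


class NotGiven:
    """Sentinel marking a field the frontend did not supply."""

    def __repr__(self) -> str:
        return "NOT_GIVEN"


NOT_GIVEN = NotGiven()


class ChatCompletionRequest:
    """Request whose sampling fields default to NOT_GIVEN instead of concrete numbers."""

    def __init__(
        self,
        temperature: Union[float, NotGiven] = NOT_GIVEN,
        top_p: Union[float, NotGiven] = NOT_GIVEN,
        max_tokens: Union[int, NotGiven] = NOT_GIVEN,
    ) -> None:
        self.temperature = temperature
        self.top_p = top_p
        self.max_tokens = max_tokens


def resolve(request: ChatCompletionRequest, model_defaults: dict) -> dict:
    """Hypothetical backend step: fill any NOT_GIVEN field from the model's own config."""
    resolved = {}
    for name in ("temperature", "top_p", "max_tokens"):
        value = getattr(request, name)
        resolved[name] = model_defaults[name] if isinstance(value, NotGiven) else value
    return resolved


# The request only sets temperature; top_p and max_tokens fall back to model defaults.
request = ChatCompletionRequest(temperature=0.2)
print(resolve(request, {"temperature": 0.7, "top_p": 0.95, "max_tokens": 1024}))
```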

tqchen avatar Apr 10 '24 13:04 tqchen

cc @MasterJH5574 @anibohara2000 @Celve @Hzfengsy

tqchen avatar Apr 10 '24 13:04 tqchen

https://github.com/mlc-ai/mlc-llm/pull/2178

rickzx avatar Apr 19 '24 16:04 rickzx

Let us also confirm whether this is the case for JSONFFIEngine.

tqchen avatar Apr 20 '24 18:04 tqchen

@tqchen PR to change this in JSONFFIEngine: https://github.com/mlc-ai/mlc-llm/pull/2225

rickzx avatar Apr 25 '24 22:04 rickzx

Just to follow up on the case of JSONFFIEngine. A main goal of JSONFFIEngine is to avoid passing objects in and parsing mlc-chat-config on the FFI side, so the current implementation, which relies on parsing and passing the default config from the Python side, is not desirable.

Instead, we should populate these values during JSONFFIEngine.reload inside the C++ implementation. This keeps the JSONFFIEngine default behavior useful across backends (such as iOS). Can you follow up on that? A rough sketch of the intended flow is below.
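
For illustration only, a conceptual Python sketch of the reload-time flow (the real change would live in the C++ JSONFFIEngine; the class, method names, and config fields here are illustrative assumptions, not the actual API):

```python
import json


class EngineSketch:
    """Illustrative stand-in for the engine: defaults are captured once at reload."""

    def __init__(self) -> None:
        self.default_generation_config: dict = {}

    def reload(self, mlc_chat_config_json: str) -> None:
        # Capture model-specific defaults when the model is (re)loaded,
        # instead of having the Python frontend parse and forward them.
        config = json.loads(mlc_chat_config_json)
        self.default_generation_config = {
            "temperature": config.get("temperature", 1.0),
            "top_p": config.get("top_p", 1.0),
        }

    def chat_completion(self, request_json: str) -> dict:
        # Fields omitted by the frontend (i.e. NOT_GIVEN) fall back to the
        # defaults captured at reload time.
        request = json.loads(request_json)
        return {**self.default_generation_config, **request}


engine = EngineSketch()
engine.reload('{"temperature": 0.7, "top_p": 0.95}')
# The request leaves top_p unset, so the model default from reload applies.
print(engine.chat_completion('{"temperature": 0.1}'))
```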

tqchen avatar Apr 27 '24 21:04 tqchen

@MasterJH5574 it would be good to confirm the current state of this issue in JSONFFI.

tqchen avatar May 11 '24 03:05 tqchen