OpenLLM
feat(config): `dict(config)`
Feature request
openllm.LLMConfig should be serializable as a dict
```python
config = openllm.AutoConfig.for_model('dolly-v2')
dict(config)  # should be the same as config.model_dump(flatten=True)
```
This means that during `LLMConfig` class generation, the generated target class should be a slotted class.

This requires some work, but it is not very high priority at the moment. Feel free to pick it up.
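For reference, a minimal sketch of one way this could work: Python's `dict()` constructor accepts any object that exposes `keys()` and `__getitem__`, so the generated slotted class only needs those two methods delegating to `model_dump(flatten=True)`. The mixin below is hypothetical and not part of OpenLLM's actual codegen.

```python
from typing import Any


class DictProtocolMixin:
    """Hypothetical mixin: lets dict(instance) mirror model_dump(flatten=True)."""

    __slots__ = ()  # stays compatible with a slotted generated class

    def keys(self):
        # dict() first calls keys(), then __getitem__ for each key.
        return self.model_dump(flatten=True).keys()

    def __getitem__(self, item: str) -> Any:
        return self.model_dump(flatten=True)[item]
```

With something like this mixed into the generated config class, `dict(config)` and `config.model_dump(flatten=True)` would produce the same mapping.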
Motivation
No response
Other
No response