web-llm
Support StableLM
I'm curious whether you will add StableLM support to web-llm? It would be really great if so.
I haven't tried the 3B model yet, but if the results aren't too bad, it could be a solution for people who don't have enough memory.
It seems interesting to get StableLM in, but it's not the top priority among the models we are going to support. Our next model to support is Dolly, and we will discuss whether to support StableLM after that.
Can you share the whole process, from scratch, of how to convert the original Dolly model files to web-llm? Then we could convert StableLM and other models ourselves.
I think that would be the usual approach in an open-source ecosystem. Having your team support models one by one, without any instructions, makes this repository feel like it is not really open source. For now, I will move on.
Thank you for the discussion and interest. We are in the process of bringing in new models, and with them we also publish open recipes in the PRs so people can bring up and contribute new models to the repo. There are PRs for Dolly and ChatGLM which could serve as a reference. You are more than welcome to open a PR for StableLM as well.
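For anyone who wants a concrete starting point before studying those PRs, here is a minimal, hypothetical sketch of the general shape of such a conversion recipe: load a Hugging Face checkpoint, group-quantize the large weight matrices to 4 bits, and write out shards plus a manifest that a web runtime could fetch. Everything here (the function names, shard layout, and quantization scheme) is illustrative rather than web-llm's actual code, and the real pipeline additionally compiles the model through Apache TVM into a WebGPU-targeted artifact, which this sketch omits. The Dolly and ChatGLM PRs remain the authoritative reference.

```python
# Hypothetical sketch of a weight-conversion recipe, NOT web-llm's real code:
# load an HF checkpoint, group-quantize 2-D weights to 4 bits, dump shards.
import json
import numpy as np
import torch
from transformers import AutoModelForCausalLM

def quantize_group_q4(w: np.ndarray, group: int = 32):
    """Symmetric 4-bit group quantization: int4 codes (stored as uint8 in
    [0, 15]) plus one fp16 scale per group of `group` values."""
    flat = w.reshape(-1, group)
    scale = np.abs(flat).max(axis=1, keepdims=True) / 7.0 + 1e-8
    codes = np.clip(np.round(flat / scale), -8, 7).astype(np.int8)
    return (codes + 8).astype(np.uint8), scale.astype(np.float16)

model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-3b")
manifest = {}
for i, (name, param) in enumerate(model.state_dict().items()):
    w = param.detach().cpu().to(torch.float32).numpy()
    if w.ndim == 2 and w.size % 32 == 0:
        # Quantize the big matmul weights; the runtime dequantizes on the fly.
        codes, scales = quantize_group_q4(w)
        np.savez(f"params_shard_{i}.npz", codes=codes, scales=scales)
    else:
        # Keep small tensors (norms, biases, etc.) in plain fp16.
        np.savez(f"params_shard_{i}.npz", data=w.astype(np.float16))
    manifest[name] = f"params_shard_{i}.npz"

# Manifest tells the client which shard holds each named parameter.
with open("ndarray-manifest.json", "w") as f:
    json.dump(manifest, f)
```

In the real system, the dequantization step would live in the GPU shaders generated by the compiler, so the browser only ever downloads the compact 4-bit shards; the sketch above only shows the offline half of that split.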
Considering StableVicuna's release, I'd greatly appreciate it if the 13B model were supported. Your software runs swimmingly on Intel Arc, and it's one of the few repos that can actually utilize it in this manner.