LLaMA support
Hi, I am trying to use LLaMA (https://huggingface.co/docs/transformers/main/model_doc/llama) with ecco, but it reports that the model is not supported yet. Could you please let me know how I can make it work?
Best Regards, Srikanth
That is a great idea. Could you go over the configs here: https://github.com/jalammar/ecco/blob/main/notebooks/Identifying%20model%20configuration.ipynb
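Following the pattern in that notebook, a LLaMA config might look roughly like the sketch below. This is an assumption, not a tested config: the layer names are taken from the Hugging Face `LlamaForCausalLM` module tree, and the `token_prefix` values may need adjusting for LLaMA's SentencePiece tokenizer.

```python
# Hypothetical ecco model config for LLaMA, following the structure used in
# the "Identifying model configuration" notebook. All layer names below are
# assumptions based on transformers' LlamaForCausalLM and may need tweaking.
llama_config = {
    'embedding': "model.embed_tokens.weight",  # input embedding matrix
    'type': 'causal',                          # decoder-only language model
    'activations': [r'mlp\.down_proj'],        # FFN output layers to capture
    'token_prefix': '',                        # SentencePiece tokenizer; GPT-2's 'Ġ' does not apply
    'partial_token_prefix': '',
}
```

If ecco's `from_pretrained` accepts a custom config (as the notebook suggests), you would then pass this dict in when loading the model.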
Hey @srikanthmalla , do you have any update on this? I can also help with it.
Hi, I'm also interested in using GLM2 with ecco. Do you have any advice on the config settings? Thanks
@LAKan233 @srikanthmalla: use this library instead: https://github.com/inseq-team/inseq. It is more recent and actively maintained.