Tao Zhang

Results: 3 comments by Tao Zhang

> I am using v1.0.0 because AFAIK, TGI supported Rope Scaling after releasing v1.0.0 and `lmsys/vicuna-13b-v1.5-16k` uses it.
>
> From their HF page:
>
> > `Vicuna v1.5 (16k)...`
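
For context, a minimal sketch of what serving a 16k RoPE-scaled model with TGI might look like, assuming an image recent enough to expose the `--rope-scaling` and `--rope-factor` launcher flags (which, per the comment, landed after v1.0.0); the port mapping, volume path, and scaling factor here are illustrative only, not confirmed by the thread:

```sh
# Sketch only: serve lmsys/vicuna-13b-v1.5-16k with linear RoPE scaling.
# Assumes --rope-scaling / --rope-factor are available in the pulled image;
# the factor of 4.0 and the token limits are illustrative values.
model=lmsys/vicuna-13b-v1.5-16k
volume=$PWD/data   # cache model weights on the host

docker run --gpus all --shm-size 1g -p 8080:80 -v $volume:/data \
    ghcr.io/huggingface/text-generation-inference:latest \
    --model-id $model \
    --rope-scaling linear \
    --rope-factor 4.0 \
    --max-input-length 15000 \
    --max-total-tokens 16384
```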

> > chatglm-6b is not supported at the moment as it requires additional python dependencies.
>
> Is there any way to deploy chatglm through TGI?

use the 0.9.1 docker...
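
The truncated reply points at the 0.9.1 docker image but doesn't spell out the launch command. A sketch of what that attempt might look like, assuming the standard text-generation-launcher options; `--trust-remote-code` is an assumption here because chatglm-6b ships custom python modeling code, and whether the 0.9.1 image actually loads it is not confirmed by the comment:

```sh
# Sketch only: try chatglm-6b on the 0.9.1 image mentioned in the reply.
# --trust-remote-code is assumed to be required for the model's custom code;
# success on this image is not verified by the truncated comment.
docker run --gpus all --shm-size 1g -p 8080:80 -v $PWD/data:/data \
    ghcr.io/huggingface/text-generation-inference:0.9.1 \
    --model-id THUDM/chatglm-6b \
    --trust-remote-code
```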

> > > > chatglm-6b is not supported at the moment as it requires additional python dependencies.
> > >
> > > Is there any way...