Xiaochang Wu
> were you able to run this locally? does it work? I am just looking forward to seeing how to update this project to support the latest vLLM I am working...
> Hey all, I also have similar updates on a fork; however, I've struggled to get feedback from the maintainers to work out how to proceed here. I...
> @xwu99 the comment says `vllm is installed seperately from source for now`, but I don't see it being installed anywhere? You just need to follow the official vLLM guide.
> @xwu99 I saw `worker_use_ray=False`; does that mean your implementation cannot support model parallelization? I mean `world_size > 1`? vLLM for CPU does not support tensor parallelism yet. This PR...
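For context, a minimal sketch of what that single-worker CPU setup looks like, assuming an older vLLM release where `worker_use_ray` is still accepted as an engine argument (newer releases dropped it); the model name is a placeholder, not part of this PR:

```python
# Minimal sketch (not this PR's code): single local worker on CPU,
# no Ray and no tensor parallelism. Assumes an older vLLM release
# where `worker_use_ray` is still a valid engine argument.
from vllm import LLM, SamplingParams

llm = LLM(
    model="facebook/opt-125m",  # placeholder model, for illustration only
    tensor_parallel_size=1,     # vLLM on CPU does not support tensor parallelism yet
    worker_use_ray=False,       # run the worker in-process, without a Ray cluster
)

outputs = llm.generate(["Hello, my name is"], SamplingParams(max_tokens=16))
print(outputs[0].outputs[0].text)
```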
closed the issue.
Could you separate this into two PRs: one to enable only the Python package, another to update the scripts. Don't mix them together.
Please rebase to master and resolve the comments.
@argentea could you rebase to master?
@argentea could you help on this?
@minmingzhu should add a PR to address this.