LiHo
Ran into the same problem — looks like it's a vLLM issue?
I don't think it's supported; I just tried. Any chance of supporting it? @VainF
> Thanks for reporting this. There hasn't been any activity here in quite some time, so we'll close this issue for now. If this is still a problem (using a...
Hi @hiradp, did you get this issue resolved?
Same issue here with vllm 0.7.2 under high concurrency.
Looks like shared workers are not supported in some browsers.
How about torch 2.4.1 and 2.5? @strint
Same issue here. @asagi4, did you figure it out?
@Foul-Tarnished @lovejing0306 have you found any alternative for speeding up the Flux model in the meantime, while waiting for this new feature?