bikash119
Apologies for being a noob here. Does the `Deployment` category mean providing vLLM as a serverless inference service on dstack?
Ok, so under `Deployment` we should show two examples: how to use vLLM as a `service`, and how to use vLLM as a `task`. Does this sound ok?
Thank you @peterschmidt85 for being patient with my questions.
@peterschmidt85 : I have made the required changes. Verified by executing `mkdocs serve`.
@peterschmidt85 : May I take this up?
Thank you @peterschmidt85 for the heads up. I'll give it a try and keep you posted on how it goes. May I request you to point to any...
Thank you @peterschmidt85.
Created a new issue #1759. Will raise PR soon.
@davidberenstein1957 : Can I take a stab at it?
Hi @plaguss : Can I take this up?