text-generation-inference
[DOC] Write documentation around setting the input_length, batch size and decode length parameters
See: #165
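For context, these parameters map roughly onto the server-side launcher flags (which bound input length and the batch token budget) and the per-request generation options (which bound decode length). Below is a minimal sketch using the public `text_generation` Python client against a locally running server; the launcher flag names mentioned in the comments are assumptions based on `text-generation-launcher --help` at the time, not taken from this issue:

```python
# Minimal sketch, assuming a text-generation-inference server is already running
# locally, started with flags such as --max-input-length and --max-total-tokens
# (names assumed from the public launcher; verify with `text-generation-launcher --help`).
from text_generation import Client

# Connect to the locally running server.
client = Client("http://127.0.0.1:8080")

# max_new_tokens bounds the decode length for this single request; the
# server-side launcher flags bound the input length and the total token
# budget shared across the batched requests.
response = client.generate(
    "What is Deep Learning?",
    max_new_tokens=64,
)
print(response.generated_text)
```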
Hey, I am new to this repository and fascinated by the way the inference is done. I would like to start with this issue, but I first need to get familiar with the repository/pipeline. How can I get started?