
Update gptj-sagemaker.md - update token length argument for advanced inference.

MattWaller opened this issue 3 years ago · 0 comments

This fixes the max_length documentation. Using the length of the input string doesn't reflect the true stopping point in tokens; we should use the number of tokens instead. For the example prompt, the string is 296 characters long but only 83 tokens. When I was deploying my model, this mismatch was causing timeouts during max-token generation.
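A minimal sketch of the difference, assuming the Hugging Face transformers tokenizer for GPT-J (the prompt text and token budget here are illustrative, not the blog's actual example):

```python
from transformers import AutoTokenizer

# Hypothetical prompt and generation budget, for illustration only.
text = "Can you please let us know more details about your request?"
num_new_tokens = 50

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")

# Character count: overestimates the prompt length, so max_length asks
# the model to generate far more tokens than intended and can time out.
max_length_from_chars = len(text) + num_new_tokens

# Token count: bounds generation the way max_length actually works.
num_prompt_tokens = len(tokenizer(text)["input_ids"])
max_length_from_tokens = num_prompt_tokens + num_new_tokens

print(max_length_from_chars, max_length_from_tokens)
```

Since max_length is measured in tokens, basing it on the tokenized prompt keeps the generation budget predictable regardless of how long the raw string is.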

MattWaller · Jun 10 '22