llama-index-python

Increase llm_max_tokens value

Open · guygregory opened this issue 1 year ago · 0 comments

Purpose

  • LLM responses were being truncated for common prompts, so the llm_max_tokens value is increased from 100 to 800 tokens to reduce the chance of truncation
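As a rough sketch of what this change amounts to (the actual file, variable names, and model name in this repo may differ; everything here is illustrative), the token cap passed to the LLM client is raised from 100 to 800:

```python
# Illustrative sketch only: the setting and model names below are
# assumptions, not necessarily this repo's actual identifiers.

OLD_MAX_TOKENS = 100  # previous cap; long answers were cut off
NEW_MAX_TOKENS = 800  # new cap proposed by this PR

def build_llm_config(max_tokens: int = NEW_MAX_TOKENS) -> dict:
    """Return keyword arguments for constructing an LLM client."""
    return {
        "model": "gpt-35-turbo",   # assumed model name
        "temperature": 0.0,
        "max_tokens": max_tokens,  # raised so typical responses fit
    }

config = build_llm_config()
```

A completion request built from this config would then allow up to 800 output tokens instead of 100, which is the whole substance of the change.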

Does this introduce a breaking change?

[ ] Yes
[x] No

Pull Request Type

What kind of change does this Pull Request introduce?

[x] Bugfix
[ ] Feature
[ ] Code style update (formatting, local variables)
[ ] Refactoring (no functional changes, no api changes)
[ ] Documentation content changes
[ ] Other... Please describe:

How to Test

  • Get the code
git clone [repo-address]
cd [repo-name]
git checkout [branch-name]
npm install
  • Test the code

What to Check

Verify that the following are valid:

  • ...

Other Information

guygregory · Jul 25 '24 14:07